Apr 17 17:21:38.954964 ip-10-0-143-59 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 17:21:38.954977 ip-10-0-143-59 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 17:21:38.954986 ip-10-0-143-59 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 17:21:38.955293 ip-10-0-143-59 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 17:21:50.426240 ip-10-0-143-59 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 17:21:50.426259 ip-10-0-143-59 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1994f6571dfc422897afb59ae553e8cc --
Apr 17 17:24:10.543205 ip-10-0-143-59 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:24:10.972925 ip-10-0-143-59 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:10.972925 ip-10-0-143-59 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:24:10.972925 ip-10-0-143-59 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:10.972925 ip-10-0-143-59 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:24:10.972925 ip-10-0-143-59 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:10.974275 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.974187 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:24:10.978586 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978570 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:10.978586 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978587 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978591 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978595 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978598 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978601 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978604 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978607 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978610 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978613 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978616 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978627 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978630 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978632 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978635 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978637 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978640 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978643 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978645 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978655 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978658 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:10.978651 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978661 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978664 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978667 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978669 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978672 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978674 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978677 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978680 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978684 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978688 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978692 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978695 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978698 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978700 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978703 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978706 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978709 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978711 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978715 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:10.979180 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978719 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978722 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978725 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978728 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978731 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978733 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978736 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978738 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978741 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978743 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978746 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978748 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978750 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978753 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978757 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978760 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978763 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978765 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978768 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978771 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:10.979689 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978773 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978776 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978778 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978781 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978784 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978786 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978789 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978791 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978794 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978797 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978799 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978802 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978804 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978807 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978810 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978818 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978823 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978827 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978830 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:10.980201 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978834 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978850 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978853 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978856 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978859 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978862 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.978865 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979321 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979329 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979332 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979335 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979338 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979341 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979345 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979348 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979351 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979354 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979358 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979360 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979363 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:10.980650 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979366 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979368 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979371 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979374 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979377 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979379 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979382 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979385 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979388 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979391 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979393 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979397 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979401 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979404 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979408 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979411 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979413 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979416 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979418 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:10.981144 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979421 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979424 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979427 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979429 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979432 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979434 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979437 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979439 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979442 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979444 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979447 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979449 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979452 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979455 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979457 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979460 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979462 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979464 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979467 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979469 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:10.981613 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979472 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979474 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979477 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979479 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979482 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979485 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979488 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979491 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979493 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979496 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979498 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979500 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979503 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979506 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979508 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979511 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979514 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979516 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979519 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979521 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:10.982110 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979527 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979530 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979532 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979535 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979537 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979540 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979542 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979544 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979547 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979549 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979552 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979554 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979557 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.979559 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979637 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979651 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979658 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979663 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979668 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979671 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979676 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:24:10.982619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979681 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979684 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979687 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979691 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979695 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979698 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979701 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979705 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979708 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979711 2565 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979714 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979717 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979721 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979724 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979727 2565 flags.go:64] FLAG: --config-dir=""
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979730 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979734 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979738 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979741 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979744 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979748 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979752 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979755 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979758 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979761 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:24:10.983168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979764 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979768 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979772 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979775 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979778 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979781 2565 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979784 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979789 2565 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979792 2565 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979795 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979798 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979801 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979805 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979808 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979811 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979814 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979817 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979820 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979823 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979826 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979830 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979833 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:24:10.983774
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979849 2565 flags.go:64] FLAG: --feature-gates="" Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979853 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979856 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:24:10.983774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979860 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979863 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979867 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979870 2565 flags.go:64] FLAG: --help="false" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979873 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-143-59.ec2.internal" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979876 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979879 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979882 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979885 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979889 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979892 2565 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979895 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979898 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979902 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979905 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979908 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979911 2565 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979914 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979917 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979920 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979923 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979926 2565 flags.go:64] FLAG: --lock-file="" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979929 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979932 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:24:10.984417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979935 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:24:10.979940 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979944 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979947 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979950 2565 flags.go:64] FLAG: --logging-format="text" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979953 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979956 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979959 2565 flags.go:64] FLAG: --manifest-url="" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979962 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979967 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979970 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979978 2565 flags.go:64] FLAG: --max-pods="110" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979981 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979984 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979987 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979990 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 
17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979993 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979996 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.979999 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980008 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980011 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980015 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980018 2565 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:24:10.985046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980020 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980026 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980029 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980032 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980035 2565 flags.go:64] FLAG: --port="10250" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980038 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980042 2565 flags.go:64] FLAG: 
--provider-id="aws:///us-east-1a/i-067f5ee55135e5c9c" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980045 2565 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980048 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980051 2565 flags.go:64] FLAG: --register-node="true" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980053 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980057 2565 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980061 2565 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980064 2565 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980067 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980070 2565 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980073 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980076 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980080 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980082 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980085 2565 flags.go:64] FLAG: --runonce="false" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980090 2565 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980093 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980096 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980099 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980102 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:24:10.985605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980105 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980108 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980112 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980121 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980124 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980127 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980130 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980133 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980136 2565 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980139 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 
17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980144 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980147 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980150 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980154 2565 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980157 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980160 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980163 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980166 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980170 2565 flags.go:64] FLAG: --v="2" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980175 2565 flags.go:64] FLAG: --version="false" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980179 2565 flags.go:64] FLAG: --vmodule="" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980183 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.980186 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980284 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:10.986300 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980288 2565 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980291 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980294 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980299 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980301 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980304 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980306 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980309 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980312 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980315 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980317 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980321 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980328 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980331 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980334 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980337 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980340 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980342 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980345 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:10.986899 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980347 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980350 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980353 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980355 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980358 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980360 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks 
Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980363 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980366 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980368 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980371 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980373 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980376 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980379 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980381 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980383 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980386 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980390 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980393 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980395 2565 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980398 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:10.987375 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980400 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980403 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980406 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980409 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980411 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980414 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980418 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980421 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980424 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980427 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980429 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980432 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980434 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980437 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980439 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980442 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980444 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980447 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980453 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980456 2565 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:10.987892 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980459 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980461 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980464 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980466 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980469 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980471 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980474 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980476 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980480 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980482 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980485 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980487 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980490 2565 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980492 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980495 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980498 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980500 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980503 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980505 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980508 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:10.988382 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980511 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980513 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980516 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980518 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980521 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.980523 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.981264 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.987331 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.987346 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987395 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987400 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987404 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987407 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987410 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987413 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:10.988915 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987416 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987418 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987421 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987424 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987426 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987429 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987431 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987434 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987436 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987439 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987442 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987445 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987448 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987450 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987453 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987456 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987458 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987461 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987464 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987466 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:10.989280 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987469 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987471 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987474 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987477 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987479 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987483 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987486 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987489 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987491 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987494 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987496 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987499 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987501 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987504 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987507 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987511 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987514 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987517 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987520 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:10.989797 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987522 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987525 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987527 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987530 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987532 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987535 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987538 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987542 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987546 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987549 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987551 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987554 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987557 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987559 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987562 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987564 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987567 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987570 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987573 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:10.990284 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987579 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987582 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987585 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987587 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987590 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987593 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987596 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987598 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987601 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987604 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987606 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987609 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987611 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987614 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987616 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987619 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987621 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987624 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987626 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987629 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:10.990826 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987631 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987634 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.987639 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987739 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987743 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987746 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987749 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987752 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987755 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987757 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987759 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987763 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987766 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987770 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987773 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987775 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:10.991338 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987778 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987780 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987783 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987786 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987788 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987791 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987793 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987796 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987798 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987801 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987803 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987807 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987812 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987814 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987817 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987820 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987822 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987825 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987827 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987830 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:10.991739 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987832 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987854 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987858 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987861 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987864 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987866 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987869 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987872 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987875 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987878 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987881 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987884 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987887 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987890 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987893 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987895 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987898 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987901 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987904 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987906 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:10.992226 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987909 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987911 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987914 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987917 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987919 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987922 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987924 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987927 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987930 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987933 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987935 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987938 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987940 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987943 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987945 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987948 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987950 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987953 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987955 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987958 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:10.992699 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987961 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987964 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987966 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987970 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987974 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987977 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987980 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987983 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987986 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987988 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987991 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987993 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:10.987996 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.988001 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:10.993179 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.988611 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:24:10.993524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.990735 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:24:10.993524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.991561 2565 server.go:1019] "Starting client certificate rotation"
Apr 17 17:24:10.993524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.991658 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:24:10.993524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:10.991702 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:24:11.015632 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.015612 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:24:11.020384 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.020367 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:24:11.038577 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.038555 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:24:11.045288 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.045273 2565 log.go:25] "Validated CRI v1 image API"
Apr 17 17:24:11.046393 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.046375 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:24:11.046471 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.046410 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:24:11.051706 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.051683 2565 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 93c31a48-717c-4e3e-84f7-970408d71c5f:/dev/nvme0n1p4 a8772b21-877a-4af3-b45c-278fcfac9825:/dev/nvme0n1p3]
Apr 17 17:24:11.051777 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.051705 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:24:11.057393 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.057291 2565 manager.go:217] Machine: {Timestamp:2026-04-17 17:24:11.055373937 +0000 UTC m=+0.387991575 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099925 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c8e093860765b674f034b1f5fa1e9 SystemUUID:ec2c8e09-3860-765b-674f-034b1f5fa1e9 BootID:1994f657-1dfc-4228-97af-b59ae553e8cc Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ca:75:91:79:a7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ca:75:91:79:a7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:2e:d6:58:88:da Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:24:11.057478 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.057395 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:24:11.057512 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.057475 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:24:11.058788 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.058761 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:24:11.058945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.058791 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-59.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:24:11.058989 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.058955 2565 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:24:11.058989 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.058963 2565 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:24:11.058989 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.058977 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:11.059679 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.059669 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:11.060714 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.060704 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:11.060819 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.060809 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:24:11.063308 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.063298 2565 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:24:11.063349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.063312 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:24:11.063349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.063325 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:24:11.063349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.063335 2565 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:24:11.063430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.063353 2565 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 17:24:11.064420 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.064407 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:11.064468 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.064428 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:11.067071 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.067057 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:24:11.068272 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.068260 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:24:11.069821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069726 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:24:11.069821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069760 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:24:11.069821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069774 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:24:11.069821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069785 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:24:11.069821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069796 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:24:11.069821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069808 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:24:11.069821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069823 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
17:24:11.070063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069850 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:24:11.070063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069865 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:24:11.070063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069880 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:24:11.070063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069915 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:24:11.070063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.069932 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:24:11.074986 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.074966 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:24:11.075035 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.075016 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-59.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:24:11.075791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.075780 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:24:11.075825 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.075795 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:24:11.079155 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.079141 2565 watchdog_linux.go:99] 
"Systemd watchdog is not enabled" Apr 17 17:24:11.079202 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.079181 2565 server.go:1295] "Started kubelet" Apr 17 17:24:11.079293 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.079265 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:24:11.079764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.079275 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:24:11.079812 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.079779 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:24:11.080232 ip-10-0-143-59 systemd[1]: Started Kubernetes Kubelet. Apr 17 17:24:11.081520 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.081500 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:24:11.083073 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.083055 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:24:11.086044 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.086015 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:11.086825 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.086463 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:24:11.087564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.087545 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:24:11.087564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.087544 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:24:11.087564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.087566 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:24:11.087746 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.087704 2565 reconstruct.go:97] "Volume reconstruction finished" 
Apr 17 17:24:11.087746 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.087713 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:24:11.087854 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.087750 2565 factory.go:55] Registering systemd factory Apr 17 17:24:11.087854 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.087766 2565 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:24:11.088173 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.088150 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found" Apr 17 17:24:11.088315 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.088209 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-59.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:24:11.088426 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.088216 2565 factory.go:153] Registering CRI-O factory Apr 17 17:24:11.088514 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.088430 2565 factory.go:223] Registration of the crio container factory successfully Apr 17 17:24:11.088514 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.088491 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:24:11.088593 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.088527 2565 factory.go:103] Registering Raw factory Apr 17 17:24:11.088593 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.088541 2565 manager.go:1196] Started watching for new ooms in manager Apr 17 17:24:11.089508 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.088299 2565 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-59.ec2.internal.18a734d130015ad9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-59.ec2.internal,UID:ip-10-0-143-59.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-59.ec2.internal,},FirstTimestamp:2026-04-17 17:24:11.079154393 +0000 UTC m=+0.411772030,LastTimestamp:2026-04-17 17:24:11.079154393 +0000 UTC m=+0.411772030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-59.ec2.internal,}" Apr 17 17:24:11.089508 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.089302 2565 manager.go:319] Starting recovery of all containers Apr 17 17:24:11.090104 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.090060 2565 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:24:11.094463 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.094435 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-h4pv7" Apr 17 17:24:11.095746 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.095634 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-59.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:24:11.095893 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.095861 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:24:11.095973 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.095888 2565 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 17:24:11.096781 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.096764 2565 manager.go:324] Recovery completed Apr 17 17:24:11.101349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.101336 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:11.103054 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.103040 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-h4pv7" Apr 17 17:24:11.103797 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.103782 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:11.103854 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.103812 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:11.103854 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.103822 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:11.104314 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.104301 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:24:11.104370 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.104315 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:24:11.104370 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.104332 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:11.105502 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.105426 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-59.ec2.internal.18a734d13179623d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-59.ec2.internal,UID:ip-10-0-143-59.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-59.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-59.ec2.internal,},FirstTimestamp:2026-04-17 17:24:11.103797821 +0000 UTC m=+0.436415457,LastTimestamp:2026-04-17 17:24:11.103797821 +0000 UTC m=+0.436415457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-59.ec2.internal,}" Apr 17 17:24:11.106471 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.106459 2565 policy_none.go:49] "None policy: Start" Apr 17 17:24:11.106546 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.106474 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:24:11.106546 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.106484 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.145488 2565 manager.go:341] "Starting Device Plugin manager" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.145519 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.145533 2565 server.go:85] "Starting device plugin registration server" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.145970 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.145987 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.146173 2565 
plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.146250 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.146258 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.147306 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:24:11.159653 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.147415 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-59.ec2.internal\" not found" Apr 17 17:24:11.188832 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.188805 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:24:11.188990 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.188851 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:24:11.188990 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.188873 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:24:11.188990 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.188881 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:24:11.188990 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.188919 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:24:11.192202 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.192183 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:11.247237 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.247155 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:11.248163 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.248144 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:11.248274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.248175 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:11.248274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.248185 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:11.248274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.248208 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.257234 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.257216 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.257330 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.257237 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-59.ec2.internal\": node \"ip-10-0-143-59.ec2.internal\" not found" Apr 17 17:24:11.289751 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.289720 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal"] Apr 17 17:24:11.289896 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.289801 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:11.290317 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.290298 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found" Apr 17 17:24:11.290797 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.290782 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:11.290894 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.290809 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:11.290894 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.290819 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:11.292061 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.292047 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:11.292200 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.292188 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.292234 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.292215 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:11.293073 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.293059 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:11.293073 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.293062 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:11.293174 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.293089 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:11.293174 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.293091 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:11.293174 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.293105 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:11.293174 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.293120 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:11.294185 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.294172 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.294247 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.294199 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:11.294903 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.294885 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:11.295005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.294916 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:11.295005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.294929 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:11.319754 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.319727 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-59.ec2.internal\" not found" node="ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.324168 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.324147 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-59.ec2.internal\" not found" node="ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.389146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.389116 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/38e6c6efcebb2223fd3833916862d318-config\") pod \"kube-apiserver-proxy-ip-10-0-143-59.ec2.internal\" (UID: \"38e6c6efcebb2223fd3833916862d318\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.391203 ip-10-0-143-59 kubenswrapper[2565]: E0417 
17:24:11.391184 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found" Apr 17 17:24:11.490113 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.490079 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/38e6c6efcebb2223fd3833916862d318-config\") pod \"kube-apiserver-proxy-ip-10-0-143-59.ec2.internal\" (UID: \"38e6c6efcebb2223fd3833916862d318\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.490113 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.490100 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/38e6c6efcebb2223fd3833916862d318-config\") pod \"kube-apiserver-proxy-ip-10-0-143-59.ec2.internal\" (UID: \"38e6c6efcebb2223fd3833916862d318\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.490332 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.490166 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/727c55257da0e2451831c00f9fa5ff7d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal\" (UID: \"727c55257da0e2451831c00f9fa5ff7d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.490332 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.490193 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/727c55257da0e2451831c00f9fa5ff7d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal\" (UID: \"727c55257da0e2451831c00f9fa5ff7d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" Apr 17 17:24:11.492144 ip-10-0-143-59 
kubenswrapper[2565]: E0417 17:24:11.492125 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found"
Apr 17 17:24:11.590878 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.590788 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/727c55257da0e2451831c00f9fa5ff7d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal\" (UID: \"727c55257da0e2451831c00f9fa5ff7d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal"
Apr 17 17:24:11.590878 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.590831 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/727c55257da0e2451831c00f9fa5ff7d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal\" (UID: \"727c55257da0e2451831c00f9fa5ff7d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal"
Apr 17 17:24:11.591021 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.590902 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/727c55257da0e2451831c00f9fa5ff7d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal\" (UID: \"727c55257da0e2451831c00f9fa5ff7d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal"
Apr 17 17:24:11.591021 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.590910 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/727c55257da0e2451831c00f9fa5ff7d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal\" (UID: \"727c55257da0e2451831c00f9fa5ff7d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal"
Apr 17 17:24:11.592906 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.592886 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found"
Apr 17 17:24:11.622120 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.622092 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal"
Apr 17 17:24:11.626490 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.626476 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal"
Apr 17 17:24:11.693373 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.693344 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found"
Apr 17 17:24:11.793943 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.793917 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found"
Apr 17 17:24:11.894470 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.894399 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found"
Apr 17 17:24:11.991819 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.991792 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:24:11.992422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:11.991937 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:24:11.995008 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:11.994992 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-59.ec2.internal\" not found"
Apr 17 17:24:12.046785 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.046755 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:12.063949 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.063917 2565 apiserver.go:52] "Watching apiserver"
Apr 17 17:24:12.074314 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.074290 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 17:24:12.074664 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.074644 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-gnmlr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4","openshift-image-registry/node-ca-rkk2d","openshift-multus/multus-additional-cni-plugins-2hc4g","openshift-ovn-kubernetes/ovnkube-node-4pk44","openshift-cluster-node-tuning-operator/tuned-5zr7s","openshift-dns/node-resolver-f54jc","openshift-multus/multus-t2kcl","openshift-multus/network-metrics-daemon-clsml","openshift-network-diagnostics/network-check-target-kzg6t","openshift-network-operator/iptables-alerter-m7bkv"]
Apr 17 17:24:12.076669 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.076644 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gnmlr"
Apr 17 17:24:12.077564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.077540 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.077642 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.077612 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rkk2d"
Apr 17 17:24:12.078589 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.078574 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.079536 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.079520 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.079651 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.079555 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6w5bw\""
Apr 17 17:24:12.079651 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.079557 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 17:24:12.079763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.079746 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 17:24:12.079824 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.079811 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 17:24:12.080181 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.080168 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 17:24:12.081657 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.080820 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.081657 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.080951 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 17:24:12.081657 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.080983 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6fv27\""
Apr 17 17:24:12.081657 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.081092 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 17:24:12.081657 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.081482 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 17:24:12.081972 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.081714 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 17:24:12.081972 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.081804 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 17:24:12.082162 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082129 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 17:24:12.082394 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082373 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082511 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082600 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-c4bww\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082614 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-z4hj5\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082635 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082866 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r5w7h\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.082933 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.083032 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 17:24:12.083373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.083257 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 17:24:12.083761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.083447 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 17:24:12.083761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.083600 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:24:12.083761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.083642 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 17:24:12.083761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.083690 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 17:24:12.084974 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.084954 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f54jc"
Apr 17 17:24:12.085120 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.084988 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 17:24:12.085120 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.085057 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.085345 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.085314 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jwdsh\""
Apr 17 17:24:12.086225 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.086209 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 17:24:12.086363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.086347 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml"
Apr 17 17:24:12.086499 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.086476 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c"
Apr 17 17:24:12.087356 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.087341 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal"
Apr 17 17:24:12.087470 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.087452 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t"
Apr 17 17:24:12.087551 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.087507 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7"
Apr 17 17:24:12.088582 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.088565 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7bkv"
Apr 17 17:24:12.093017 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.092993 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-slash\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093088 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093036 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-cni-bin\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093088 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093063 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-systemd\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093102 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fz2\" (UniqueName: \"kubernetes.io/projected/1ac6ac62-b297-4aad-a58f-12c981987869-kube-api-access-27fz2\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc"
Apr 17 17:24:12.093168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093127 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-k8s-cni-cncf-io\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.093168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093154 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jv2\" (UniqueName: \"kubernetes.io/projected/913b5ebf-9596-434b-8ff9-ccc478779f3c-kube-api-access-q2jv2\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv"
Apr 17 17:24:12.093272 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093183 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-tmp\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.093272 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093208 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-env-overrides\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093272 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093229 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0680fa77-380e-4eb4-acfb-45207e761c9b-ovn-node-metrics-cert\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093272 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093251 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-socket-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.093451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093283 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.093451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093313 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.093451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093340 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-kubernetes\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.093451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093387 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ll79\" (UniqueName: \"kubernetes.io/projected/a04f786b-b1fc-4078-9d22-1b263b785992-kube-api-access-6ll79\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d"
Apr 17 17:24:12.093451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093422 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-kubelet\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093445 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093466 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-ovnkube-config\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093488 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-var-lib-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093511 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-system-cni-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093542 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-netns\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093582 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-multus-certs\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093609 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-etc-kubernetes\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093656 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093686 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysconfig\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093712 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysctl-d\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093735 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.093763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093759 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbltx\" (UniqueName: \"kubernetes.io/projected/0680fa77-380e-4eb4-acfb-45207e761c9b-kube-api-access-sbltx\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-socket-dir-parent\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093815 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-cni-multus\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093859 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/913b5ebf-9596-434b-8ff9-ccc478779f3c-host-slash\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093890 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-cni-netd\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093913 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-etc-selinux\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093934 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-systemd\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093954 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-sys\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093974 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-lib-modules\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.093996 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-systemd-units\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094057 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86drl\" (UniqueName: \"kubernetes.io/projected/03e6b129-19be-4d65-adaa-3b9d0e336aba-kube-api-access-86drl\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094079 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cnibin\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094101 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-host\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094123 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094146 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094179 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a04f786b-b1fc-4078-9d22-1b263b785992-serviceca\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d"
Apr 17 17:24:12.094316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094203 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-tuned\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094226 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsbgt\" (UniqueName: \"kubernetes.io/projected/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-kube-api-access-lsbgt\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094260 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-etc-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094280 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-ovnkube-script-lib\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094295 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7c713c7a-430f-48a4-9274-ca5da277991e-cni-binary-copy\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094310 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cf0e9c4d-2f86-4ecc-8ccf-c91544212af9-agent-certs\") pod \"konnectivity-agent-gnmlr\" (UID: \"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9\") " pod="kube-system/konnectivity-agent-gnmlr"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094335 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-registration-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094355 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-device-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094376 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-sys-fs\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094398 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094414 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a04f786b-b1fc-4078-9d22-1b263b785992-host\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094428 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-node-log\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094446 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094491 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094515 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-cnibin\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094530 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7c713c7a-430f-48a4-9274-ca5da277991e-multus-daemon-config\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.095093 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094547 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwc7t\" (UniqueName: \"kubernetes.io/projected/7c713c7a-430f-48a4-9274-ca5da277991e-kube-api-access-lwc7t\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094562 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e9c4d-2f86-4ecc-8ccf-c91544212af9-konnectivity-ca\") pod \"konnectivity-agent-gnmlr\" (UID: \"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9\") " pod="kube-system/konnectivity-agent-gnmlr"
Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094577 2565 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-system-cni-dir\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094591 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysctl-conf\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094621 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-log-socket\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094639 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ac6ac62-b297-4aad-a58f-12c981987869-hosts-file\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094653 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-os-release\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.095764 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094672 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-cni-bin\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094705 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-kubelet\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrwt\" (UniqueName: \"kubernetes.io/projected/22fe4e50-13fd-4ae5-b9a6-1552184b400f-kube-api-access-llrwt\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094764 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbr2w\" (UniqueName: \"kubernetes.io/projected/053402a3-0f05-4423-9697-95ba118cec9c-kube-api-access-fbr2w\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094790 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-ovn\") pod 
\"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094813 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-os-release\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094852 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-hostroot\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094869 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-conf-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094883 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/913b5ebf-9596-434b-8ff9-ccc478779f3c-iptables-alerter-script\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv" Apr 17 17:24:12.095764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094897 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-run-netns\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.096363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094941 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-modprobe-d\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.096363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094970 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-var-lib-kubelet\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.096363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.094988 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ac62-b297-4aad-a58f-12c981987869-tmp-dir\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc" Apr 17 17:24:12.096363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.095007 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-cni-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.096363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.095023 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-run\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.096363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.096159 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:24:12.096650 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.096412 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:24:12.096650 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.096458 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:24:12.096650 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.096564 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b926f\"" Apr 17 17:24:12.097168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.097149 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-42rvn\"" Apr 17 17:24:12.097899 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.097879 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:12.097985 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.097923 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sx926\"" Apr 17 17:24:12.098042 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.097986 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 
17 17:24:12.098090 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.098045 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:24:12.104899 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.104865 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:19:11 +0000 UTC" deadline="2027-09-29 09:03:07.783244417 +0000 UTC" Apr 17 17:24:12.104899 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.104896 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12711h38m55.6783506s" Apr 17 17:24:12.108303 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.108282 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal"] Apr 17 17:24:12.109736 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.109710 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod727c55257da0e2451831c00f9fa5ff7d.slice/crio-84cfbb0e67df8fc0317047dc17275c938c68e42e24ebd7c54c93e00f6fb833ad WatchSource:0}: Error finding container 84cfbb0e67df8fc0317047dc17275c938c68e42e24ebd7c54c93e00f6fb833ad: Status 404 returned error can't find the container with id 84cfbb0e67df8fc0317047dc17275c938c68e42e24ebd7c54c93e00f6fb833ad Apr 17 17:24:12.109978 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.109957 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:12.110055 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.110043 2565 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" Apr 17 17:24:12.110707 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.110688 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e6c6efcebb2223fd3833916862d318.slice/crio-74e1f89f8335fa43ed0888d70869545ba89bbfc9e23cb42a74d9d6c730043f1b WatchSource:0}: Error finding container 74e1f89f8335fa43ed0888d70869545ba89bbfc9e23cb42a74d9d6c730043f1b: Status 404 returned error can't find the container with id 74e1f89f8335fa43ed0888d70869545ba89bbfc9e23cb42a74d9d6c730043f1b Apr 17 17:24:12.112677 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.112656 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:12.114625 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.114611 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:24:12.129482 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.129464 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:12.129591 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.129573 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal"] Apr 17 17:24:12.137240 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.137222 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rbrsp" Apr 17 17:24:12.147250 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.146639 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rbrsp" Apr 17 17:24:12.188503 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.188479 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:24:12.192091 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.192040 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal" event={"ID":"38e6c6efcebb2223fd3833916862d318","Type":"ContainerStarted","Data":"74e1f89f8335fa43ed0888d70869545ba89bbfc9e23cb42a74d9d6c730043f1b"} Apr 17 17:24:12.193000 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.192981 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" event={"ID":"727c55257da0e2451831c00f9fa5ff7d","Type":"ContainerStarted","Data":"84cfbb0e67df8fc0317047dc17275c938c68e42e24ebd7c54c93e00f6fb833ad"} Apr 17 17:24:12.195182 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195166 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-k8s-cni-cncf-io\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.195243 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195191 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jv2\" (UniqueName: \"kubernetes.io/projected/913b5ebf-9596-434b-8ff9-ccc478779f3c-kube-api-access-q2jv2\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv" Apr 17 17:24:12.195243 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195207 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-tmp\") pod \"tuned-5zr7s\" (UID: 
\"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.195243 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195224 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-env-overrides\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195243 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195239 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0680fa77-380e-4eb4-acfb-45207e761c9b-ovn-node-metrics-cert\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195390 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-k8s-cni-cncf-io\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.195442 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195379 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-socket-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" Apr 17 17:24:12.195442 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.195540 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.195540 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195482 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-kubernetes\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.195540 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ll79\" (UniqueName: \"kubernetes.io/projected/a04f786b-b1fc-4078-9d22-1b263b785992-kube-api-access-6ll79\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d" Apr 17 17:24:12.195672 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195538 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-kubelet\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195672 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195587 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-kubelet\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195672 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195583 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-socket-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" Apr 17 17:24:12.195672 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195614 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:24:12.195672 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195628 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-kubernetes\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195674 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195702 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-ovnkube-config\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195737 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195742 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195767 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-env-overrides\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195777 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-var-lib-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-system-cni-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195829 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-netns\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195832 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-var-lib-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195876 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-multus-certs\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195901 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-etc-kubernetes\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195905 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-system-cni-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195928 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.195931 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195942 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-netns\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195957 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysconfig\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195978 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-etc-kubernetes\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.195981 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysctl-d\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196020 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196031 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-run-multus-certs\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196035 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbltx\" (UniqueName: \"kubernetes.io/projected/0680fa77-380e-4eb4-acfb-45207e761c9b-kube-api-access-sbltx\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196081 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-socket-dir-parent\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196089 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196109 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-cni-multus\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196135 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/913b5ebf-9596-434b-8ff9-ccc478779f3c-host-slash\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196142 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysconfig\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196159 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-cni-netd\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196183 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-etc-selinux\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196187 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-socket-dir-parent\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196192 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-ovnkube-config\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.196466 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196207 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-systemd\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196235 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-sys\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196243 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/913b5ebf-9596-434b-8ff9-ccc478779f3c-host-slash\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196258 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-lib-modules\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196282 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-systemd-units\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196289 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-cni-multus\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196301 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysctl-d\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196331 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-systemd-units\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196340 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-sys\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196342 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-systemd\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196382 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-lib-modules\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196383 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-cni-netd\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196383 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86drl\" (UniqueName: \"kubernetes.io/projected/03e6b129-19be-4d65-adaa-3b9d0e336aba-kube-api-access-86drl\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196404 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196429 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-etc-selinux\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cnibin\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196458 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cnibin\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.197227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-host\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196488 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196508 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196520 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-host\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196524 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a04f786b-b1fc-4078-9d22-1b263b785992-serviceca\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.196635 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.196713 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs podName:053402a3-0f05-4423-9697-95ba118cec9c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:12.696677596 +0000 UTC m=+2.029295229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs") pod "network-metrics-daemon-clsml" (UID: "053402a3-0f05-4423-9697-95ba118cec9c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196751 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-tuned\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196781 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsbgt\" (UniqueName: \"kubernetes.io/projected/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-kube-api-access-lsbgt\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-etc-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196832 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-ovnkube-script-lib\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7c713c7a-430f-48a4-9274-ca5da277991e-cni-binary-copy\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196934 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cf0e9c4d-2f86-4ecc-8ccf-c91544212af9-agent-certs\") pod \"konnectivity-agent-gnmlr\" (UID: \"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9\") " pod="kube-system/konnectivity-agent-gnmlr"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196960 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-registration-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.196987 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-device-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197014 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-sys-fs\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197038 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.198058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a04f786b-b1fc-4078-9d22-1b263b785992-host\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197086 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-node-log\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197113 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197140 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197167 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-cnibin\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197192 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7c713c7a-430f-48a4-9274-ca5da277991e-multus-daemon-config\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197217 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwc7t\" (UniqueName: \"kubernetes.io/projected/7c713c7a-430f-48a4-9274-ca5da277991e-kube-api-access-lwc7t\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197209 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-sys-fs\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197246 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e9c4d-2f86-4ecc-8ccf-c91544212af9-konnectivity-ca\") pod \"konnectivity-agent-gnmlr\" (UID: \"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9\") " pod="kube-system/konnectivity-agent-gnmlr"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197276 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a04f786b-b1fc-4078-9d22-1b263b785992-host\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197291 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-system-cni-dir\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197308 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-registration-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197319 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysctl-conf\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-log-socket\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ac6ac62-b297-4aad-a58f-12c981987869-hosts-file\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197433 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-sysctl-conf\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-os-release\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.198871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197480 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-cni-bin\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-kubelet\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197518 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7c713c7a-430f-48a4-9274-ca5da277991e-cni-binary-copy\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197532 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-log-socket\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llrwt\" (UniqueName: \"kubernetes.io/projected/22fe4e50-13fd-4ae5-b9a6-1552184b400f-kube-api-access-llrwt\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197581 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-etc-openvswitch\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197585 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1ac6ac62-b297-4aad-a58f-12c981987869-hosts-file\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197614 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbr2w\" (UniqueName: \"kubernetes.io/projected/053402a3-0f05-4423-9697-95ba118cec9c-kube-api-access-fbr2w\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197641 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-ovn\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197649 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22fe4e50-13fd-4ae5-b9a6-1552184b400f-cni-binary-copy\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197676 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-os-release\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197704 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-hostroot\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197729 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-conf-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197755 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/913b5ebf-9596-434b-8ff9-ccc478779f3c-iptables-alerter-script\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197782 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-run-netns\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197809 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-modprobe-d\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197856 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-var-lib-kubelet\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197882 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ac62-b297-4aad-a58f-12c981987869-tmp-dir\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc"
Apr 17 17:24:12.199630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197484 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-os-release\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197911 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-cni-bin\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197980 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-ovn\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198006 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-cni-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.197911 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-cni-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-os-release\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198069 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-device-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198063 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-run\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198100 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-run\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198105 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-slash\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198109 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7c713c7a-430f-48a4-9274-ca5da277991e-multus-daemon-config\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198134 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-cni-bin\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198171 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-systemd\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198189 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03e6b129-19be-4d65-adaa-3b9d0e336aba-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198197 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27fz2\" (UniqueName: \"kubernetes.io/projected/1ac6ac62-b297-4aad-a58f-12c981987869-kube-api-access-27fz2\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198240 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-cnibin\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl"
Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198279 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName:
\"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-hostroot\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198326 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-multus-conf-dir\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.200358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198456 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-node-log\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198471 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a04f786b-b1fc-4078-9d22-1b263b785992-serviceca\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198540 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22fe4e50-13fd-4ae5-b9a6-1552184b400f-system-cni-dir\") pod \"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198584 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-cni-bin\") 
pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198618 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-slash\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198679 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-run-systemd\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198697 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c713c7a-430f-48a4-9274-ca5da277991e-host-var-lib-kubelet\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198766 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198780 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1ac6ac62-b297-4aad-a58f-12c981987869-tmp-dir\") pod 
\"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0680fa77-380e-4eb4-acfb-45207e761c9b-host-run-netns\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cf0e9c4d-2f86-4ecc-8ccf-c91544212af9-konnectivity-ca\") pod \"konnectivity-agent-gnmlr\" (UID: \"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9\") " pod="kube-system/konnectivity-agent-gnmlr" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198857 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-modprobe-d\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198864 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-var-lib-kubelet\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.198895 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/913b5ebf-9596-434b-8ff9-ccc478779f3c-iptables-alerter-script\") pod 
\"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.199092 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0680fa77-380e-4eb4-acfb-45207e761c9b-ovnkube-script-lib\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.199292 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-tmp\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.200280 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cf0e9c4d-2f86-4ecc-8ccf-c91544212af9-agent-certs\") pod \"konnectivity-agent-gnmlr\" (UID: \"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9\") " pod="kube-system/konnectivity-agent-gnmlr" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.200385 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0680fa77-380e-4eb4-acfb-45207e761c9b-ovn-node-metrics-cert\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.200823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.200464 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-etc-tuned\") pod \"tuned-5zr7s\" (UID: 
\"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.211011 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.210558 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:12.211011 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.210582 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:12.211011 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.210594 2565 projected.go:194] Error preparing data for projected volume kube-api-access-pwbwm for pod openshift-network-diagnostics/network-check-target-kzg6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:12.211011 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.210663 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm podName:cee37054-a7f8-4e3d-8733-e19de26444e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:12.71064496 +0000 UTC m=+2.043262584 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pwbwm" (UniqueName: "kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm") pod "network-check-target-kzg6t" (UID: "cee37054-a7f8-4e3d-8733-e19de26444e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:12.213400 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.213339 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fz2\" (UniqueName: \"kubernetes.io/projected/1ac6ac62-b297-4aad-a58f-12c981987869-kube-api-access-27fz2\") pod \"node-resolver-f54jc\" (UID: \"1ac6ac62-b297-4aad-a58f-12c981987869\") " pod="openshift-dns/node-resolver-f54jc" Apr 17 17:24:12.213400 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.213369 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbltx\" (UniqueName: \"kubernetes.io/projected/0680fa77-380e-4eb4-acfb-45207e761c9b-kube-api-access-sbltx\") pod \"ovnkube-node-4pk44\" (UID: \"0680fa77-380e-4eb4-acfb-45207e761c9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.213543 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.213437 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jv2\" (UniqueName: \"kubernetes.io/projected/913b5ebf-9596-434b-8ff9-ccc478779f3c-kube-api-access-q2jv2\") pod \"iptables-alerter-m7bkv\" (UID: \"913b5ebf-9596-434b-8ff9-ccc478779f3c\") " pod="openshift-network-operator/iptables-alerter-m7bkv" Apr 17 17:24:12.213543 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.213440 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsbgt\" (UniqueName: \"kubernetes.io/projected/d3dd37cb-53dd-44c5-8770-bbeeca8e02bc-kube-api-access-lsbgt\") pod \"tuned-5zr7s\" (UID: \"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.213543 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.213531 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86drl\" (UniqueName: \"kubernetes.io/projected/03e6b129-19be-4d65-adaa-3b9d0e336aba-kube-api-access-86drl\") pod \"aws-ebs-csi-driver-node-lzrd4\" (UID: \"03e6b129-19be-4d65-adaa-3b9d0e336aba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" Apr 17 17:24:12.213969 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.213946 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ll79\" (UniqueName: \"kubernetes.io/projected/a04f786b-b1fc-4078-9d22-1b263b785992-kube-api-access-6ll79\") pod \"node-ca-rkk2d\" (UID: \"a04f786b-b1fc-4078-9d22-1b263b785992\") " pod="openshift-image-registry/node-ca-rkk2d" Apr 17 17:24:12.214079 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.214061 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbr2w\" (UniqueName: \"kubernetes.io/projected/053402a3-0f05-4423-9697-95ba118cec9c-kube-api-access-fbr2w\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:12.214370 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.214355 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwc7t\" (UniqueName: \"kubernetes.io/projected/7c713c7a-430f-48a4-9274-ca5da277991e-kube-api-access-lwc7t\") pod \"multus-t2kcl\" (UID: \"7c713c7a-430f-48a4-9274-ca5da277991e\") " pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.214471 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.214452 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrwt\" (UniqueName: \"kubernetes.io/projected/22fe4e50-13fd-4ae5-b9a6-1552184b400f-kube-api-access-llrwt\") pod 
\"multus-additional-cni-plugins-2hc4g\" (UID: \"22fe4e50-13fd-4ae5-b9a6-1552184b400f\") " pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.282597 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.282570 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:12.407258 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.407181 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gnmlr" Apr 17 17:24:12.413264 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.413234 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0e9c4d_2f86_4ecc_8ccf_c91544212af9.slice/crio-2bb9342d1058ec80194f16eace3ea1cee5c3006ba7b69b151140fe3af86ff9a0 WatchSource:0}: Error finding container 2bb9342d1058ec80194f16eace3ea1cee5c3006ba7b69b151140fe3af86ff9a0: Status 404 returned error can't find the container with id 2bb9342d1058ec80194f16eace3ea1cee5c3006ba7b69b151140fe3af86ff9a0 Apr 17 17:24:12.416173 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.416155 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" Apr 17 17:24:12.422680 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.422654 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e6b129_19be_4d65_adaa_3b9d0e336aba.slice/crio-a9e807003bf714087e567e48705c6cfbe69f696104c63e6984437afeac79f842 WatchSource:0}: Error finding container a9e807003bf714087e567e48705c6cfbe69f696104c63e6984437afeac79f842: Status 404 returned error can't find the container with id a9e807003bf714087e567e48705c6cfbe69f696104c63e6984437afeac79f842 Apr 17 17:24:12.423774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.423757 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rkk2d" Apr 17 17:24:12.429778 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.428680 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" Apr 17 17:24:12.430988 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.430967 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04f786b_b1fc_4078_9d22_1b263b785992.slice/crio-697d578ca5236fa84cd39126ebdd480e67bc2dfc058a1f02a764f6528a7686f1 WatchSource:0}: Error finding container 697d578ca5236fa84cd39126ebdd480e67bc2dfc058a1f02a764f6528a7686f1: Status 404 returned error can't find the container with id 697d578ca5236fa84cd39126ebdd480e67bc2dfc058a1f02a764f6528a7686f1 Apr 17 17:24:12.434149 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.434120 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:12.436040 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.436019 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22fe4e50_13fd_4ae5_b9a6_1552184b400f.slice/crio-a2fab0c83523aa43b89a23d75e71664ea7c2f386076adaa628dbe4e71f29f635 WatchSource:0}: Error finding container a2fab0c83523aa43b89a23d75e71664ea7c2f386076adaa628dbe4e71f29f635: Status 404 returned error can't find the container with id a2fab0c83523aa43b89a23d75e71664ea7c2f386076adaa628dbe4e71f29f635 Apr 17 17:24:12.439065 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.439046 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" Apr 17 17:24:12.440618 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.440599 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0680fa77_380e_4eb4_acfb_45207e761c9b.slice/crio-b634f790c371fc02472590558f761f96dc5e36282e289246a3c56cbd09d90d01 WatchSource:0}: Error finding container b634f790c371fc02472590558f761f96dc5e36282e289246a3c56cbd09d90d01: Status 404 returned error can't find the container with id b634f790c371fc02472590558f761f96dc5e36282e289246a3c56cbd09d90d01 Apr 17 17:24:12.444710 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.444691 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f54jc" Apr 17 17:24:12.446418 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.446394 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3dd37cb_53dd_44c5_8770_bbeeca8e02bc.slice/crio-141e8d260d12f4a6a8f437376c7e3d9db23490b899a6e24847dda66580977487 WatchSource:0}: Error finding container 141e8d260d12f4a6a8f437376c7e3d9db23490b899a6e24847dda66580977487: Status 404 returned error can't find the container with id 141e8d260d12f4a6a8f437376c7e3d9db23490b899a6e24847dda66580977487 Apr 17 17:24:12.451022 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.450934 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t2kcl" Apr 17 17:24:12.452436 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.452405 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac6ac62_b297_4aad_a58f_12c981987869.slice/crio-89137b06434bd68765f0f1077c3b7a9d14475187b9e241d48756195f74dc8cd9 WatchSource:0}: Error finding container 89137b06434bd68765f0f1077c3b7a9d14475187b9e241d48756195f74dc8cd9: Status 404 returned error can't find the container with id 89137b06434bd68765f0f1077c3b7a9d14475187b9e241d48756195f74dc8cd9 Apr 17 17:24:12.455974 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.455750 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7bkv" Apr 17 17:24:12.461024 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.461002 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c713c7a_430f_48a4_9274_ca5da277991e.slice/crio-a2ae4d7f17086d0b0cf1d707c61353c7871b7619230bf86aae85b3b6581e4579 WatchSource:0}: Error finding container a2ae4d7f17086d0b0cf1d707c61353c7871b7619230bf86aae85b3b6581e4579: Status 404 returned error can't find the container with id a2ae4d7f17086d0b0cf1d707c61353c7871b7619230bf86aae85b3b6581e4579 Apr 17 17:24:12.464544 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:12.464525 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913b5ebf_9596_434b_8ff9_ccc478779f3c.slice/crio-bdaf66862f291180d6ea6ef4bdaa7b15dea0911603bedf520a93a20554eb4454 WatchSource:0}: Error finding container bdaf66862f291180d6ea6ef4bdaa7b15dea0911603bedf520a93a20554eb4454: Status 404 returned error can't find the container with id bdaf66862f291180d6ea6ef4bdaa7b15dea0911603bedf520a93a20554eb4454 Apr 17 17:24:12.626585 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.626556 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:12.701197 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.701125 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:12.701341 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.701294 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:12.701396 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.701372 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs podName:053402a3-0f05-4423-9697-95ba118cec9c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:13.701351671 +0000 UTC m=+3.033969297 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs") pod "network-metrics-daemon-clsml" (UID: "053402a3-0f05-4423-9697-95ba118cec9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:12.802798 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.802129 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:12.802798 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.802304 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:12.802798 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.802321 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:12.802798 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.802334 2565 projected.go:194] Error preparing data for projected volume kube-api-access-pwbwm for pod openshift-network-diagnostics/network-check-target-kzg6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:12.802798 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:12.802397 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm podName:cee37054-a7f8-4e3d-8733-e19de26444e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:13.802377124 +0000 UTC m=+3.134994765 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pwbwm" (UniqueName: "kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm") pod "network-check-target-kzg6t" (UID: "cee37054-a7f8-4e3d-8733-e19de26444e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:12.881228 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:12.881152 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:13.147618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.147531 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:12 +0000 UTC" deadline="2027-12-16 08:55:11.682905251 +0000 UTC" Apr 17 17:24:13.147618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.147574 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14583h30m58.535336295s" Apr 17 17:24:13.194135 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.194079 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:13.194331 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:13.194223 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:13.199390 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.199356 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7bkv" event={"ID":"913b5ebf-9596-434b-8ff9-ccc478779f3c","Type":"ContainerStarted","Data":"bdaf66862f291180d6ea6ef4bdaa7b15dea0911603bedf520a93a20554eb4454"} Apr 17 17:24:13.217738 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.217698 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2kcl" event={"ID":"7c713c7a-430f-48a4-9274-ca5da277991e","Type":"ContainerStarted","Data":"a2ae4d7f17086d0b0cf1d707c61353c7871b7619230bf86aae85b3b6581e4579"} Apr 17 17:24:13.234958 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.234670 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"b634f790c371fc02472590558f761f96dc5e36282e289246a3c56cbd09d90d01"} Apr 17 17:24:13.243784 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.243678 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerStarted","Data":"a2fab0c83523aa43b89a23d75e71664ea7c2f386076adaa628dbe4e71f29f635"} Apr 17 17:24:13.257326 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.257258 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gnmlr" event={"ID":"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9","Type":"ContainerStarted","Data":"2bb9342d1058ec80194f16eace3ea1cee5c3006ba7b69b151140fe3af86ff9a0"} Apr 17 17:24:13.259568 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.259525 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f54jc" event={"ID":"1ac6ac62-b297-4aad-a58f-12c981987869","Type":"ContainerStarted","Data":"89137b06434bd68765f0f1077c3b7a9d14475187b9e241d48756195f74dc8cd9"} Apr 17 17:24:13.281226 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.281152 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" event={"ID":"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc","Type":"ContainerStarted","Data":"141e8d260d12f4a6a8f437376c7e3d9db23490b899a6e24847dda66580977487"} Apr 17 17:24:13.287055 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.287019 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rkk2d" event={"ID":"a04f786b-b1fc-4078-9d22-1b263b785992","Type":"ContainerStarted","Data":"697d578ca5236fa84cd39126ebdd480e67bc2dfc058a1f02a764f6528a7686f1"} Apr 17 17:24:13.297380 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.297345 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" event={"ID":"03e6b129-19be-4d65-adaa-3b9d0e336aba","Type":"ContainerStarted","Data":"a9e807003bf714087e567e48705c6cfbe69f696104c63e6984437afeac79f842"} Apr 17 17:24:13.710909 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.710832 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " 
pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:13.711106 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:13.711040 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:13.711168 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:13.711111 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs podName:053402a3-0f05-4423-9697-95ba118cec9c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:15.711089649 +0000 UTC m=+5.043707276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs") pod "network-metrics-daemon-clsml" (UID: "053402a3-0f05-4423-9697-95ba118cec9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:13.811908 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:13.811704 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:13.811908 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:13.811896 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:13.811908 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:13.811915 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:13.812193 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:13.811928 2565 
projected.go:194] Error preparing data for projected volume kube-api-access-pwbwm for pod openshift-network-diagnostics/network-check-target-kzg6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:13.812193 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:13.811987 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm podName:cee37054-a7f8-4e3d-8733-e19de26444e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:15.811968503 +0000 UTC m=+5.144586130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pwbwm" (UniqueName: "kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm") pod "network-check-target-kzg6t" (UID: "cee37054-a7f8-4e3d-8733-e19de26444e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:14.148617 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:14.148486 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:12 +0000 UTC" deadline="2027-10-23 21:06:01.10830648 +0000 UTC" Apr 17 17:24:14.148617 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:14.148525 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13299h41m46.959786061s" Apr 17 17:24:14.190560 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:14.190067 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:14.190560 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:14.190203 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:15.189558 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.189523 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:15.190090 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.189668 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:15.402525 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.402490 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bf9w9"] Apr 17 17:24:15.404473 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.404450 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.404585 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.404534 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:15.423032 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.423000 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.423188 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.423129 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-dbus\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.423249 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.423184 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-kubelet-config\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.523975 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.523890 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-kubelet-config\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.523975 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.523958 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.524200 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.524026 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-dbus\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.524252 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.524225 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-dbus\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.524315 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.524296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-kubelet-config\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:15.524419 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.524399 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:15.524489 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.524467 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret podName:7b024619-6a7c-49fc-b69c-a21fab2b9f5c nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:16.024448707 +0000 UTC m=+5.357066331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret") pod "global-pull-secret-syncer-bf9w9" (UID: "7b024619-6a7c-49fc-b69c-a21fab2b9f5c") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:15.727124 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.726505 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:15.727124 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.726692 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:15.727124 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.726755 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs podName:053402a3-0f05-4423-9697-95ba118cec9c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:19.726737779 +0000 UTC m=+9.059355409 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs") pod "network-metrics-daemon-clsml" (UID: "053402a3-0f05-4423-9697-95ba118cec9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:15.827192 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:15.827068 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:15.827357 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.827267 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:15.827357 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.827295 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:15.827357 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.827311 2565 projected.go:194] Error preparing data for projected volume kube-api-access-pwbwm for pod openshift-network-diagnostics/network-check-target-kzg6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:15.827515 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:15.827372 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm podName:cee37054-a7f8-4e3d-8733-e19de26444e7 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:19.827353123 +0000 UTC m=+9.159970762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pwbwm" (UniqueName: "kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm") pod "network-check-target-kzg6t" (UID: "cee37054-a7f8-4e3d-8733-e19de26444e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:16.029197 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:16.028485 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:16.029197 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:16.028688 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:16.029197 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:16.028755 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret podName:7b024619-6a7c-49fc-b69c-a21fab2b9f5c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:17.028737226 +0000 UTC m=+6.361354858 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret") pod "global-pull-secret-syncer-bf9w9" (UID: "7b024619-6a7c-49fc-b69c-a21fab2b9f5c") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:16.190066 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:16.189984 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:16.190540 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:16.190115 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:17.037589 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:17.037552 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:17.037774 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:17.037705 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:17.037828 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:17.037777 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret podName:7b024619-6a7c-49fc-b69c-a21fab2b9f5c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:19.037758118 +0000 UTC m=+8.370375828 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret") pod "global-pull-secret-syncer-bf9w9" (UID: "7b024619-6a7c-49fc-b69c-a21fab2b9f5c") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:17.189764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:17.189730 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:17.189955 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:17.189868 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:17.190269 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:17.190248 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:17.190648 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:17.190346 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:18.189813 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:18.189752 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:18.190000 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:18.189938 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:19.053370 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:19.053328 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:19.053808 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.053512 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:19.053808 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.053590 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret podName:7b024619-6a7c-49fc-b69c-a21fab2b9f5c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:23.053570715 +0000 UTC m=+12.386188346 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret") pod "global-pull-secret-syncer-bf9w9" (UID: "7b024619-6a7c-49fc-b69c-a21fab2b9f5c") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:19.193476 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:19.193076 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:19.193476 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:19.193110 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:19.193476 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.193201 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:19.193721 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.193593 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:19.759132 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:19.759096 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:19.759313 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.759257 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:19.759385 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.759331 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs podName:053402a3-0f05-4423-9697-95ba118cec9c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:27.759311911 +0000 UTC m=+17.091929535 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs") pod "network-metrics-daemon-clsml" (UID: "053402a3-0f05-4423-9697-95ba118cec9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:19.859877 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:19.859824 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:19.860041 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.860002 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:19.860041 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.860026 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:19.860041 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.860037 2565 projected.go:194] Error preparing data for projected volume kube-api-access-pwbwm for pod openshift-network-diagnostics/network-check-target-kzg6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:19.860194 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:19.860101 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm podName:cee37054-a7f8-4e3d-8733-e19de26444e7 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:27.86008079 +0000 UTC m=+17.192698420 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pwbwm" (UniqueName: "kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm") pod "network-check-target-kzg6t" (UID: "cee37054-a7f8-4e3d-8733-e19de26444e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:20.190089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:20.190009 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:20.190542 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:20.190163 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:21.194268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:21.194233 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:21.194704 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:21.194355 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:21.194704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:21.194233 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:21.194861 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:21.194785 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:22.189873 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:22.189830 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:22.190041 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:22.189940 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:23.084313 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:23.084275 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:23.084751 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:23.084441 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:23.084751 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:23.084523 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret podName:7b024619-6a7c-49fc-b69c-a21fab2b9f5c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:31.084500329 +0000 UTC m=+20.417117968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret") pod "global-pull-secret-syncer-bf9w9" (UID: "7b024619-6a7c-49fc-b69c-a21fab2b9f5c") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:23.189683 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:23.189641 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:23.189863 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:23.189779 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:23.189863 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:23.189802 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:23.189971 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:23.189910 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:24.189499 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:24.189463 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:24.190010 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:24.189608 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:25.189237 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:25.189198 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:25.189427 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:25.189331 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:25.189427 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:25.189382 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:25.189546 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:25.189499 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:26.189854 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:26.189804 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:26.190378 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:26.189948 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:27.189799 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:27.189761 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:27.189979 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:27.189808 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:27.189979 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.189919 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:27.190341 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.190058 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:27.819011 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:27.818969 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:27.819227 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.819124 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:27.819227 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.819201 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs podName:053402a3-0f05-4423-9697-95ba118cec9c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:43.819180663 +0000 UTC m=+33.151798298 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs") pod "network-metrics-daemon-clsml" (UID: "053402a3-0f05-4423-9697-95ba118cec9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:27.920053 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:27.920018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:27.920214 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.920148 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:27.920214 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.920168 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:27.920214 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.920181 2565 projected.go:194] Error preparing data for projected volume kube-api-access-pwbwm for pod openshift-network-diagnostics/network-check-target-kzg6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:27.920341 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:27.920235 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm podName:cee37054-a7f8-4e3d-8733-e19de26444e7 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:43.920223507 +0000 UTC m=+33.252841131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pwbwm" (UniqueName: "kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm") pod "network-check-target-kzg6t" (UID: "cee37054-a7f8-4e3d-8733-e19de26444e7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:28.189976 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:28.189895 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:28.190411 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:28.190026 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:29.189854 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:29.189803 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:29.189854 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:29.189852 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:29.190385 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:29.189944 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:29.190385 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:29.190046 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:30.189297 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:30.189124 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:30.189430 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:30.189405 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:30.335229 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:30.335108 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" event={"ID":"d3dd37cb-53dd-44c5-8770-bbeeca8e02bc","Type":"ContainerStarted","Data":"92e8af70d266eeece4a2406ae7932c3a96f5f9525744bc58b687afcf346fd564"} Apr 17 17:24:30.337019 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:30.336980 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal" event={"ID":"38e6c6efcebb2223fd3833916862d318","Type":"ContainerStarted","Data":"8389d5a43cd7da00967821c9a04bca5d4ab6c2c10f4c63624816aef06e603d8c"} Apr 17 17:24:30.338336 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:30.338316 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2kcl" event={"ID":"7c713c7a-430f-48a4-9274-ca5da277991e","Type":"ContainerStarted","Data":"4971c1aadd5f28576749419a6c518f50a559a178d8c84de004c0ff48852ac4ed"} Apr 17 17:24:30.354567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:30.354523 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5zr7s" podStartSLOduration=1.771054678 podStartE2EDuration="19.35450925s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.449056767 +0000 UTC m=+1.781674390" lastFinishedPulling="2026-04-17 17:24:30.032511331 +0000 UTC m=+19.365128962" observedRunningTime="2026-04-17 17:24:30.353193971 +0000 UTC m=+19.685811626" watchObservedRunningTime="2026-04-17 17:24:30.35450925 +0000 UTC m=+19.687126895" Apr 17 17:24:30.369780 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:30.369744 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t2kcl" podStartSLOduration=1.801375497 
podStartE2EDuration="19.369732776s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.463120943 +0000 UTC m=+1.795738573" lastFinishedPulling="2026-04-17 17:24:30.031478224 +0000 UTC m=+19.364095852" observedRunningTime="2026-04-17 17:24:30.369509743 +0000 UTC m=+19.702127411" watchObservedRunningTime="2026-04-17 17:24:30.369732776 +0000 UTC m=+19.702350421" Apr 17 17:24:30.388906 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:30.388870 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-59.ec2.internal" podStartSLOduration=18.388857004 podStartE2EDuration="18.388857004s" podCreationTimestamp="2026-04-17 17:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:30.387931457 +0000 UTC m=+19.720549102" watchObservedRunningTime="2026-04-17 17:24:30.388857004 +0000 UTC m=+19.721474656" Apr 17 17:24:31.145695 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.143798 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:31.145695 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:31.143989 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:31.145695 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:31.144135 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret podName:7b024619-6a7c-49fc-b69c-a21fab2b9f5c nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:47.144117284 +0000 UTC m=+36.476734908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret") pod "global-pull-secret-syncer-bf9w9" (UID: "7b024619-6a7c-49fc-b69c-a21fab2b9f5c") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:31.179131 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:31.178832 2565 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod727c55257da0e2451831c00f9fa5ff7d.slice/crio-f4497abab5dc72c75b36c1a63da46ca7bc770dca2c4be635f446b57895bed357.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:24:31.190116 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.190090 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:31.190384 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.190190 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:31.190468 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:31.190349 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:31.190522 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:31.190486 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:31.341347 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.341309 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rkk2d" event={"ID":"a04f786b-b1fc-4078-9d22-1b263b785992","Type":"ContainerStarted","Data":"a694967e282a664c2f6031b008badef6dbde637d6f0e55fd6354d41c3876204a"} Apr 17 17:24:31.342571 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.342552 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" event={"ID":"03e6b129-19be-4d65-adaa-3b9d0e336aba","Type":"ContainerStarted","Data":"254d5685c805942cc7410c8f99fbca4acd68f38f8bdf0b75bcd28fee37a04e8c"} Apr 17 17:24:31.344795 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.344775 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"ec824d4561d829d3b4e3189c52e8f5f907b9906d150d03a83df0ba070c109f8f"} Apr 17 17:24:31.344795 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.344795 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"8085824b76bcf0801694ad2b793aa4577bd625f5b63e0f87323978ea5aa4e2e7"} Apr 17 17:24:31.344956 ip-10-0-143-59 kubenswrapper[2565]: 
I0417 17:24:31.344804 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"1b6f72145087d419c2a7957aceab94d78441a577eb06d015eb3527c75dae9293"} Apr 17 17:24:31.344956 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.344815 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"9a2a40ac2349c464cd1e6e6dcdda167fee5cdad0d237cc0a20527dfe9ad12188"} Apr 17 17:24:31.344956 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.344825 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"7e5fdfd4c960789fe3a0da45b7a4f02c196a5b29c0107f5c66501ea5ff64b6e5"} Apr 17 17:24:31.344956 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.344834 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"aa40459d135d1026d40aeba9b913cceb861c3cc774c9d7b5985139c1e0f32c5f"} Apr 17 17:24:31.346149 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.346122 2565 generic.go:358] "Generic (PLEG): container finished" podID="22fe4e50-13fd-4ae5-b9a6-1552184b400f" containerID="06e15ebc59a3d3deb796a1504235649ea13a772373efcf2bbd51ac1d73e7e9a7" exitCode=0 Apr 17 17:24:31.346210 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.346184 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerDied","Data":"06e15ebc59a3d3deb796a1504235649ea13a772373efcf2bbd51ac1d73e7e9a7"} Apr 17 17:24:31.347487 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.347385 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gnmlr" event={"ID":"cf0e9c4d-2f86-4ecc-8ccf-c91544212af9","Type":"ContainerStarted","Data":"a408be555dd8926b030555b66ade414b243444258597da9277109b175f146377"} Apr 17 17:24:31.348701 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.348684 2565 generic.go:358] "Generic (PLEG): container finished" podID="727c55257da0e2451831c00f9fa5ff7d" containerID="f4497abab5dc72c75b36c1a63da46ca7bc770dca2c4be635f446b57895bed357" exitCode=0 Apr 17 17:24:31.348781 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.348762 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" event={"ID":"727c55257da0e2451831c00f9fa5ff7d","Type":"ContainerDied","Data":"f4497abab5dc72c75b36c1a63da46ca7bc770dca2c4be635f446b57895bed357"} Apr 17 17:24:31.349896 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.349875 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f54jc" event={"ID":"1ac6ac62-b297-4aad-a58f-12c981987869","Type":"ContainerStarted","Data":"05df1ca367ef01462b14cfeca775a378c37c209a06f2e868d76625bbb59ac511"} Apr 17 17:24:31.375482 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.375386 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gnmlr" podStartSLOduration=2.799153276 podStartE2EDuration="20.375373978s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.414780015 +0000 UTC m=+1.747397639" lastFinishedPulling="2026-04-17 17:24:29.991000705 +0000 UTC m=+19.323618341" observedRunningTime="2026-04-17 17:24:31.375160182 +0000 UTC m=+20.707777829" watchObservedRunningTime="2026-04-17 17:24:31.375373978 +0000 UTC m=+20.707991623" Apr 17 17:24:31.375573 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.375498 2565 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/node-ca-rkk2d" podStartSLOduration=10.775629648 podStartE2EDuration="20.375494425s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.432710748 +0000 UTC m=+1.765328373" lastFinishedPulling="2026-04-17 17:24:22.032575512 +0000 UTC m=+11.365193150" observedRunningTime="2026-04-17 17:24:31.360086101 +0000 UTC m=+20.692703770" watchObservedRunningTime="2026-04-17 17:24:31.375494425 +0000 UTC m=+20.708112072" Apr 17 17:24:31.416604 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.416556 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f54jc" podStartSLOduration=2.840921632 podStartE2EDuration="20.416544713s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.455412812 +0000 UTC m=+1.788030436" lastFinishedPulling="2026-04-17 17:24:30.031035879 +0000 UTC m=+19.363653517" observedRunningTime="2026-04-17 17:24:31.416097545 +0000 UTC m=+20.748715190" watchObservedRunningTime="2026-04-17 17:24:31.416544713 +0000 UTC m=+20.749162359" Apr 17 17:24:31.748481 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.748393 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gnmlr" Apr 17 17:24:31.749254 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:31.749232 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gnmlr" Apr 17 17:24:32.004730 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.004702 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:24:32.160607 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.160510 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:24:32.004726046Z","UUID":"cab22954-d81e-47b0-a4e1-4542ab35831d","Handler":null,"Name":"","Endpoint":""} Apr 17 17:24:32.163349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.163327 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:24:32.163349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.163354 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:24:32.189197 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.189163 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:32.189358 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:32.189278 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:32.353949 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.353865 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" event={"ID":"03e6b129-19be-4d65-adaa-3b9d0e336aba","Type":"ContainerStarted","Data":"2d69a3df3ddbfd13533d0f8006fa67933991884fa4c747ea377737e45a7fd977"} Apr 17 17:24:32.355900 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.355872 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7bkv" event={"ID":"913b5ebf-9596-434b-8ff9-ccc478779f3c","Type":"ContainerStarted","Data":"cfb05d11dbb48eda5d2e8455a06c431ba5c08c05cda8c3eb43cd4d9a7528e661"} Apr 17 17:24:32.356299 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.356278 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gnmlr" Apr 17 17:24:32.356815 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.356793 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gnmlr" Apr 17 17:24:32.371379 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:32.371331 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m7bkv" podStartSLOduration=3.847989172 podStartE2EDuration="21.371318764s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.466778891 +0000 UTC m=+1.799396521" lastFinishedPulling="2026-04-17 17:24:29.990108478 +0000 UTC m=+19.322726113" observedRunningTime="2026-04-17 17:24:32.370649446 +0000 UTC m=+21.703267093" watchObservedRunningTime="2026-04-17 17:24:32.371318764 +0000 UTC m=+21.703936409" Apr 17 17:24:33.189274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:33.189239 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:33.189505 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:33.189236 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:33.189505 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:33.189354 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:33.189505 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:33.189414 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:33.359379 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:33.359294 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" event={"ID":"727c55257da0e2451831c00f9fa5ff7d","Type":"ContainerStarted","Data":"d73d1c7dcc9a9c3c4a65dbb19c05ebae3a4c4a3df16e814ab0232104c3fdb8d2"} Apr 17 17:24:33.361388 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:33.361355 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" event={"ID":"03e6b129-19be-4d65-adaa-3b9d0e336aba","Type":"ContainerStarted","Data":"e6247806cbeab9eb95eaa92aefdb8d9c88900b214809cdba240ceb107c3519e9"} Apr 17 17:24:33.367274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:33.367244 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"5277e402c3b3f1d3617826245e5ddc0395a89b44ac02c5da568815250205f738"} Apr 17 17:24:33.375336 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:33.375295 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-59.ec2.internal" podStartSLOduration=21.375279798 podStartE2EDuration="21.375279798s" podCreationTimestamp="2026-04-17 17:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:33.374558764 +0000 UTC m=+22.707176410" watchObservedRunningTime="2026-04-17 17:24:33.375279798 +0000 UTC m=+22.707897444" Apr 17 17:24:33.394624 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:33.394574 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lzrd4" podStartSLOduration=2.1343493159999998 podStartE2EDuration="22.394559575s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.424276753 +0000 UTC m=+1.756894376" lastFinishedPulling="2026-04-17 17:24:32.684486994 +0000 UTC m=+22.017104635" observedRunningTime="2026-04-17 17:24:33.394551552 +0000 UTC m=+22.727169198" watchObservedRunningTime="2026-04-17 17:24:33.394559575 +0000 UTC m=+22.727177198" Apr 17 17:24:34.190103 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:34.190074 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:34.190300 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:34.190203 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:35.189547 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:35.189414 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:35.189914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:35.189483 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:35.189914 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:35.189671 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:35.189914 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:35.189717 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:35.374860 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:35.374804 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" event={"ID":"0680fa77-380e-4eb4-acfb-45207e761c9b","Type":"ContainerStarted","Data":"52ea3a2155cbf5d956bbfda8b8b1764077955a3b6eabdc65bf645396738e2713"} Apr 17 17:24:35.375146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:35.375115 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:35.375146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:35.375138 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:35.389557 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:35.389533 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:35.424468 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:35.424389 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" podStartSLOduration=6.331301336 podStartE2EDuration="24.424373131s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.44283082 +0000 UTC m=+1.775448447" 
lastFinishedPulling="2026-04-17 17:24:30.535902613 +0000 UTC m=+19.868520242" observedRunningTime="2026-04-17 17:24:35.423569887 +0000 UTC m=+24.756187557" watchObservedRunningTime="2026-04-17 17:24:35.424373131 +0000 UTC m=+24.756990779" Apr 17 17:24:36.190136 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:36.190093 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:36.190980 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:36.190228 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:36.378063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:36.377967 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:36.396707 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:36.396668 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:24:37.189486 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.189454 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:37.189659 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:37.189554 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:37.189659 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.189612 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:37.189775 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:37.189736 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:37.401577 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.401549 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kzg6t"] Apr 17 17:24:37.401950 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.401647 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:37.401950 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:37.401727 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:37.404754 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.404727 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bf9w9"] Apr 17 17:24:37.404895 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.404831 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:37.405017 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:37.404966 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:37.405583 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.405560 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-clsml"] Apr 17 17:24:37.405659 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:37.405648 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:37.405857 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:37.405742 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:38.384636 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:38.384603 2565 generic.go:358] "Generic (PLEG): container finished" podID="22fe4e50-13fd-4ae5-b9a6-1552184b400f" containerID="877a4667ce81232b773b3eded3a886f62b2dba37c9c13984e07051b6969b5bb5" exitCode=0 Apr 17 17:24:38.384807 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:38.384691 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerDied","Data":"877a4667ce81232b773b3eded3a886f62b2dba37c9c13984e07051b6969b5bb5"} Apr 17 17:24:39.189195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:39.189162 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:39.189588 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:39.189208 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:39.189588 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:39.189297 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:39.189588 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:39.189319 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:39.189588 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:39.189362 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:39.189588 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:39.189461 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:40.389717 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:40.389681 2565 generic.go:358] "Generic (PLEG): container finished" podID="22fe4e50-13fd-4ae5-b9a6-1552184b400f" containerID="2cb42fcad1074150c70c81bab8edba9164baaf64e995aa9aeef29b08cbe66756" exitCode=0 Apr 17 17:24:40.390130 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:40.389738 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerDied","Data":"2cb42fcad1074150c70c81bab8edba9164baaf64e995aa9aeef29b08cbe66756"} Apr 17 17:24:41.189757 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:41.189722 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:41.189945 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:41.189826 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:41.189945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:41.189899 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:41.190053 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:41.190022 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:41.190102 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:41.190070 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:41.190165 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:41.190144 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:41.396200 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:41.396114 2565 generic.go:358] "Generic (PLEG): container finished" podID="22fe4e50-13fd-4ae5-b9a6-1552184b400f" containerID="51c7ffe6c9f4de57242e0079df5d78c5dfa8961883c7e13ba73888288e2360b5" exitCode=0 Apr 17 17:24:41.396200 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:41.396164 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerDied","Data":"51c7ffe6c9f4de57242e0079df5d78c5dfa8961883c7e13ba73888288e2360b5"} Apr 17 17:24:43.190240 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.190205 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:43.190693 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.190246 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:43.190693 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.190211 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:43.190693 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.190334 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bf9w9" podUID="7b024619-6a7c-49fc-b69c-a21fab2b9f5c" Apr 17 17:24:43.190693 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.190437 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-clsml" podUID="053402a3-0f05-4423-9697-95ba118cec9c" Apr 17 17:24:43.190693 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.190521 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kzg6t" podUID="cee37054-a7f8-4e3d-8733-e19de26444e7" Apr 17 17:24:43.509225 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.509190 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-59.ec2.internal" event="NodeReady" Apr 17 17:24:43.509399 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.509331 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:24:43.545924 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.545654 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q"] Apr 17 17:24:43.549622 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.549557 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-596b4b48f5-ptwxj"] Apr 17 17:24:43.549793 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.549764 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q" Apr 17 17:24:43.552486 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.552468 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"] Apr 17 17:24:43.552644 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.552625 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:24:43.553554 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.553513 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.553554 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.553514 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:43.553897 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.553823 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bn8zq\"" Apr 17 17:24:43.554697 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.554681 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mh4jk"] Apr 17 17:24:43.554881 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.554863 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" Apr 17 17:24:43.555246 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.554979 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:24:43.555246 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.555022 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vv5sf\"" Apr 17 17:24:43.557419 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.557396 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:24:43.557817 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.557796 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:24:43.558169 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.558149 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.559005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.558552 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:24:43.559005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.558801 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 17:24:43.560262 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.559264 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56"] Apr 17 17:24:43.560262 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.559486 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mh4jk" Apr 17 17:24:43.578893 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.567176 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-564dfc4b58-br5kx"] Apr 17 17:24:43.581870 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.579640 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:24:43.581870 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.579872 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lks9p\"" Apr 17 17:24:43.588979 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.587884 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.588979 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.588120 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hx8hn\"" Apr 17 17:24:43.589149 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.589090 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 17:24:43.589774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.589383 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 17:24:43.589774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.589537 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:24:43.589774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.589599 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 
17:24:43.591873 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.591319 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 17:24:43.599058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.599039 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q"] Apr 17 17:24:43.599195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.599181 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-596b4b48f5-ptwxj"] Apr 17 17:24:43.599287 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.599277 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq"] Apr 17 17:24:43.599395 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.599377 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:24:43.602126 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.602095 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.602578 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.602559 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-brgkp\"" Apr 17 17:24:43.602789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.602772 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 17:24:43.603116 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.603097 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 
17:24:43.603554 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.603414 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sb45j"] Apr 17 17:24:43.603554 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.603476 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.603554 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.603520 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 17:24:43.606523 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.606503 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.606875 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.606826 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9wpnv\"" Apr 17 17:24:43.607107 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.607024 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.607289 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.607261 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:43.607583 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.607490 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.608089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.607885 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 17:24:43.608089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.607913 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 17:24:43.608089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.606935 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mh4jk"] Apr 17 17:24:43.608089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.608021 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56"] Apr 17 17:24:43.608089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.608038 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"] Apr 17 17:24:43.608089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.608041 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 17:24:43.608089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.608072 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2"] Apr 17 
17:24:43.610275 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.610253 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-pv52z\"" Apr 17 17:24:43.610407 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.610260 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 17:24:43.610782 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.610761 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 17:24:43.610885 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.610783 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 17:24:43.611390 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.611367 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.611917 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.611890 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bvxcx\"" Apr 17 17:24:43.613069 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.612406 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 17:24:43.613069 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.612832 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 17:24:43.617096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.616316 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 17:24:43.617096 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:24:43.616568 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:43.619086 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.619066 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp"] Apr 17 17:24:43.619601 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.619580 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:24:43.622374 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.622351 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 17:24:43.622494 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.622473 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 17:24:43.623104 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.622614 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dxh7z\"" Apr 17 17:24:43.623104 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.623037 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 17:24:43.625307 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.625288 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8"] Apr 17 17:24:43.625518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.625500 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.628223 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.628204 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zlw54"] Apr 17 17:24:43.628324 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.628288 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8" Apr 17 17:24:43.628324 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.628307 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 17:24:43.628431 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.628207 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.629028 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.629010 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:43.629241 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.629221 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-n8jfp\"" Apr 17 17:24:43.629731 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.629703 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 17:24:43.630910 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.630892 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:24:43.631725 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.631707 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:24:43.631795 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.631775 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-grpjs\""
Apr 17 17:24:43.633616 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633568 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-bound-sa-token\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.633616 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633601 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n86rp\" (UniqueName: \"kubernetes.io/projected/f175c7b4-7e01-4c45-b098-27c87d4ba139-kube-api-access-n86rp\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.633738 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633627 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.633738 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633660 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6h6b\" (UniqueName: \"kubernetes.io/projected/19432bca-a912-4225-ab78-b273f482ec76-kube-api-access-c6h6b\") pod \"volume-data-source-validator-7c6cbb6c87-ct24q\" (UID: \"19432bca-a912-4225-ab78-b273f482ec76\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q"
Apr 17 17:24:43.633738 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633689 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-trusted-ca\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.633738 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633714 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f175c7b4-7e01-4c45-b098-27c87d4ba139-snapshots\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.633973 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633774 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f175c7b4-7e01-4c45-b098-27c87d4ba139-tmp\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.633973 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633873 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-image-registry-private-configuration\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.633973 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633914 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-installation-pull-secrets\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.633973 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f175c7b4-7e01-4c45-b098-27c87d4ba139-service-ca-bundle\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.634176 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.633974 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f175c7b4-7e01-4c45-b098-27c87d4ba139-serving-cert\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.634176 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.634044 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-ca-trust-extracted\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.634176 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.634070 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f175c7b4-7e01-4c45-b098-27c87d4ba139-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.634176 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.634094 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.634176 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.634134 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hrmh\" (UniqueName: \"kubernetes.io/projected/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-kube-api-access-5hrmh\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.634415 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.634181 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.634415 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.634205 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x79z\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-kube-api-access-2x79z\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.634415 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.634228 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-certificates\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.646385 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.646366 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r4jgw"]
Apr 17 17:24:43.646550 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.646533 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zlw54"
Apr 17 17:24:43.650783 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.650762 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cl7ph\""
Apr 17 17:24:43.650976 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.650955 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 17:24:43.651089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.651073 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 17:24:43.653287 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653268 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq"]
Apr 17 17:24:43.653422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653294 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2"]
Apr 17 17:24:43.653422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653307 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-564dfc4b58-br5kx"]
Apr 17 17:24:43.653422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653317 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sb45j"]
Apr 17 17:24:43.653422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653329 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8"]
Apr 17 17:24:43.653422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653342 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zlw54"]
Apr 17 17:24:43.653422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653353 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r4jgw"]
Apr 17 17:24:43.653422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653365 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp"]
Apr 17 17:24:43.653744 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.653467 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r4jgw"
Apr 17 17:24:43.656263 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.656238 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 17:24:43.656263 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.656254 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rb846\""
Apr 17 17:24:43.656471 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.656457 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 17:24:43.656588 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.656570 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 17:24:43.734691 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734653 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hrmh\" (UniqueName: \"kubernetes.io/projected/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-kube-api-access-5hrmh\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.734892 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x79z\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-kube-api-access-2x79z\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.734892 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734741 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b697600b-a4a4-48b8-b42e-9965d51b283c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2"
Apr 17 17:24:43.734892 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734772 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48466d02-4d96-4b89-b43e-1adc61971469-serving-cert\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j"
Apr 17 17:24:43.734892 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734795 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48466d02-4d96-4b89-b43e-1adc61971469-trusted-ca\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j"
Apr 17 17:24:43.734892 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734822 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:43.734892 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734866 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30189955-9e22-4280-bbeb-b99c4bea9d98-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734920 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-bound-sa-token\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734958 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtj72\" (UniqueName: \"kubernetes.io/projected/8cd66734-7425-4cd0-99cb-727b3059a1ed-kube-api-access-jtj72\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.734992 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-trusted-ca\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735016 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f175c7b4-7e01-4c45-b098-27c87d4ba139-snapshots\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735054 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b20915a3-9438-4b66-b1a3-83c753d5524b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735085 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20915a3-9438-4b66-b1a3-83c753d5524b-config\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735114 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735157 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:43.735196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735183 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735208 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-installation-pull-secrets\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735258 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f175c7b4-7e01-4c45-b098-27c87d4ba139-serving-cert\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735301 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735375 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40520684-848d-46b0-8288-7c708075eda2-tmp-dir\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735415 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-ca-trust-extracted\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735442 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969fh\" (UniqueName: \"kubernetes.io/projected/fba1d80e-239c-4b35-afc0-5052c340a9ee-kube-api-access-969fh\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735467 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgb2f\" (UniqueName: \"kubernetes.io/projected/40520684-848d-46b0-8288-7c708075eda2-kube-api-access-mgb2f\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735498 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735525 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735549 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxlzs\" (UniqueName: \"kubernetes.io/projected/d61a14c7-8569-4165-aea3-ca07c214caef-kube-api-access-vxlzs\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56"
Apr 17 17:24:43.735576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735578 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-certificates\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735610 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n86rp\" (UniqueName: \"kubernetes.io/projected/f175c7b4-7e01-4c45-b098-27c87d4ba139-kube-api-access-n86rp\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zqh\" (UniqueName: \"kubernetes.io/projected/f3f1d02f-e4b4-401b-b23c-adc6bc3f2616-kube-api-access-h7zqh\") pod \"network-check-source-8894fc9bd-bgmm8\" (UID: \"f3f1d02f-e4b4-401b-b23c-adc6bc3f2616\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735670 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f175c7b4-7e01-4c45-b098-27c87d4ba139-snapshots\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735674 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6h6b\" (UniqueName: \"kubernetes.io/projected/19432bca-a912-4225-ab78-b273f482ec76-kube-api-access-c6h6b\") pod \"volume-data-source-validator-7c6cbb6c87-ct24q\" (UID: \"19432bca-a912-4225-ab78-b273f482ec76\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735739 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-default-certificate\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735767 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-stats-auth\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735817 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735865 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f175c7b4-7e01-4c45-b098-27c87d4ba139-tmp\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735895 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30189955-9e22-4280-bbeb-b99c4bea9d98-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735922 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmsjk\" (UniqueName: \"kubernetes.io/projected/48466d02-4d96-4b89-b43e-1adc61971469-kube-api-access-bmsjk\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735955 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-image-registry-private-configuration\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.735973 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.735991 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-596b4b48f5-ptwxj: secret "image-registry-tls" not found
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.735993 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl7xp\" (UniqueName: \"kubernetes.io/projected/30189955-9e22-4280-bbeb-b99c4bea9d98-kube-api-access-hl7xp\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48466d02-4d96-4b89-b43e-1adc61971469-config\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j"
Apr 17 17:24:43.736153 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.736043 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls podName:72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.236025677 +0000 UTC m=+33.568643300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls") pod "image-registry-596b4b48f5-ptwxj" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0") : secret "image-registry-tls" not found
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736090 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f175c7b4-7e01-4c45-b098-27c87d4ba139-service-ca-bundle\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736197 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-trusted-ca\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736435 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736555 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-ca-trust-extracted\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.736632 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736659 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f175c7b4-7e01-4c45-b098-27c87d4ba139-tmp\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.736677 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls podName:f65f4a7f-b841-425d-b6cf-02bd9f48fd69 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.236666161 +0000 UTC m=+33.569283784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zl5fr" (UID: "f65f4a7f-b841-425d-b6cf-02bd9f48fd69") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736751 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f175c7b4-7e01-4c45-b098-27c87d4ba139-service-ca-bundle\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736759 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f175c7b4-7e01-4c45-b098-27c87d4ba139-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736805 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxbnt\" (UniqueName: \"kubernetes.io/projected/b20915a3-9438-4b66-b1a3-83c753d5524b-kube-api-access-gxbnt\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736866 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40520684-848d-46b0-8288-7c708075eda2-config-volume\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54"
Apr 17 17:24:43.736913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.736882 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-certificates\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.737599 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.737555 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f175c7b4-7e01-4c45-b098-27c87d4ba139-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.740190 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.740166 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f175c7b4-7e01-4c45-b098-27c87d4ba139-serving-cert\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk"
Apr 17 17:24:43.740299 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.740215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-installation-pull-secrets\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.740488 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.740468 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-image-registry-private-configuration\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.755868 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.753703 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hrmh\" (UniqueName: \"kubernetes.io/projected/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-kube-api-access-5hrmh\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:43.760168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.760106 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-bound-sa-token\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:43.760168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.760132 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6h6b\" (UniqueName: \"kubernetes.io/projected/19432bca-a912-4225-ab78-b273f482ec76-kube-api-access-c6h6b\") pod \"volume-data-source-validator-7c6cbb6c87-ct24q\" (UID: \"19432bca-a912-4225-ab78-b273f482ec76\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q"
Apr 17 17:24:43.760346 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.760223 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x79z\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-kube-api-access-2x79z\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " 
pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:24:43.761240 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.761216 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n86rp\" (UniqueName: \"kubernetes.io/projected/f175c7b4-7e01-4c45-b098-27c87d4ba139-kube-api-access-n86rp\") pod \"insights-operator-585dfdc468-mh4jk\" (UID: \"f175c7b4-7e01-4c45-b098-27c87d4ba139\") " pod="openshift-insights/insights-operator-585dfdc468-mh4jk" Apr 17 17:24:43.838281 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838239 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b697600b-a4a4-48b8-b42e-9965d51b283c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:24:43.838430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838289 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48466d02-4d96-4b89-b43e-1adc61971469-serving-cert\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.838430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838315 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48466d02-4d96-4b89-b43e-1adc61971469-trusted-ca\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.838430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838340 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.838430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838366 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30189955-9e22-4280-bbeb-b99c4bea9d98-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.838430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838399 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtj72\" (UniqueName: \"kubernetes.io/projected/8cd66734-7425-4cd0-99cb-727b3059a1ed-kube-api-access-jtj72\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838432 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b20915a3-9438-4b66-b1a3-83c753d5524b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838457 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20915a3-9438-4b66-b1a3-83c753d5524b-config\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 
17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838483 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.838499 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838522 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838557 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.838579 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.338557862 +0000 UTC m=+33.671175509 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : secret "router-metrics-certs-default" not found Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.838638 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838643 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40520684-848d-46b0-8288-7c708075eda2-tmp-dir\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:43.838674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838675 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.838684 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert podName:b697600b-a4a4-48b8-b42e-9965d51b283c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.33866911 +0000 UTC m=+33.671286752 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h9fr2" (UID: "b697600b-a4a4-48b8-b42e-9965d51b283c") : secret "networking-console-plugin-cert" not found Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838712 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-969fh\" (UniqueName: \"kubernetes.io/projected/fba1d80e-239c-4b35-afc0-5052c340a9ee-kube-api-access-969fh\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.838739 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.838774 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs podName:053402a3-0f05-4423-9697-95ba118cec9c nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.838762974 +0000 UTC m=+65.171380611 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs") pod "network-metrics-daemon-clsml" (UID: "053402a3-0f05-4423-9697-95ba118cec9c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838738 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgb2f\" (UniqueName: \"kubernetes.io/projected/40520684-848d-46b0-8288-7c708075eda2-kube-api-access-mgb2f\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838812 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838874 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxlzs\" (UniqueName: \"kubernetes.io/projected/d61a14c7-8569-4165-aea3-ca07c214caef-kube-api-access-vxlzs\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838908 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zqh\" (UniqueName: \"kubernetes.io/projected/f3f1d02f-e4b4-401b-b23c-adc6bc3f2616-kube-api-access-h7zqh\") pod \"network-check-source-8894fc9bd-bgmm8\" (UID: 
\"f3f1d02f-e4b4-401b-b23c-adc6bc3f2616\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838944 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-default-certificate\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.838951 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838970 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-stats-auth\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.838995 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.839004 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert podName:8cd66734-7425-4cd0-99cb-727b3059a1ed nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.338990895 +0000 UTC m=+33.671608532 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert") pod "ingress-canary-r4jgw" (UID: "8cd66734-7425-4cd0-99cb-727b3059a1ed") : secret "canary-serving-cert" not found Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.839053 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:43.839184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839119 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b697600b-a4a4-48b8-b42e-9965d51b283c-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839199 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48466d02-4d96-4b89-b43e-1adc61971469-trusted-ca\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.839301 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.339280924 +0000 UTC m=+33.671898554 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : configmap references non-existent config key: service-ca.crt Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.839326 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls podName:40520684-848d-46b0-8288-7c708075eda2 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.33931559 +0000 UTC m=+33.671933213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls") pod "dns-default-zlw54" (UID: "40520684-848d-46b0-8288-7c708075eda2") : secret "dns-default-metrics-tls" not found Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30189955-9e22-4280-bbeb-b99c4bea9d98-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839400 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmsjk\" (UniqueName: \"kubernetes.io/projected/48466d02-4d96-4b89-b43e-1adc61971469-kube-api-access-bmsjk\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839434 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl7xp\" (UniqueName: \"kubernetes.io/projected/30189955-9e22-4280-bbeb-b99c4bea9d98-kube-api-access-hl7xp\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40520684-848d-46b0-8288-7c708075eda2-tmp-dir\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.839523 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:43.839575 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls podName:d61a14c7-8569-4165-aea3-ca07c214caef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:44.339560592 +0000 UTC m=+33.672178216 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82d56" (UID: "d61a14c7-8569-4165-aea3-ca07c214caef") : secret "samples-operator-tls" not found Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839606 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48466d02-4d96-4b89-b43e-1adc61971469-config\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839663 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxbnt\" (UniqueName: \"kubernetes.io/projected/b20915a3-9438-4b66-b1a3-83c753d5524b-kube-api-access-gxbnt\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40520684-848d-46b0-8288-7c708075eda2-config-volume\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839703 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30189955-9e22-4280-bbeb-b99c4bea9d98-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.839922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.839714 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b20915a3-9438-4b66-b1a3-83c753d5524b-config\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 17:24:43.840761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.840226 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40520684-848d-46b0-8288-7c708075eda2-config-volume\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:43.840761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.840396 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48466d02-4d96-4b89-b43e-1adc61971469-config\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.841677 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.841650 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48466d02-4d96-4b89-b43e-1adc61971469-serving-cert\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.841880 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.841857 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-default-certificate\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.841994 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.841974 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-stats-auth\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.842418 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.842400 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30189955-9e22-4280-bbeb-b99c4bea9d98-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.842671 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.842654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b20915a3-9438-4b66-b1a3-83c753d5524b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 17:24:43.848981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.848704 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtj72\" (UniqueName: \"kubernetes.io/projected/8cd66734-7425-4cd0-99cb-727b3059a1ed-kube-api-access-jtj72\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:24:43.848981 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:24:43.848942 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-969fh\" (UniqueName: \"kubernetes.io/projected/fba1d80e-239c-4b35-afc0-5052c340a9ee-kube-api-access-969fh\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:43.849248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.849219 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmsjk\" (UniqueName: \"kubernetes.io/projected/48466d02-4d96-4b89-b43e-1adc61971469-kube-api-access-bmsjk\") pod \"console-operator-9d4b6777b-sb45j\" (UID: \"48466d02-4d96-4b89-b43e-1adc61971469\") " pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.849674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.849629 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl7xp\" (UniqueName: \"kubernetes.io/projected/30189955-9e22-4280-bbeb-b99c4bea9d98-kube-api-access-hl7xp\") pod \"kube-storage-version-migrator-operator-6769c5d45-gq4sp\" (UID: \"30189955-9e22-4280-bbeb-b99c4bea9d98\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.849873 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.849830 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgb2f\" (UniqueName: \"kubernetes.io/projected/40520684-848d-46b0-8288-7c708075eda2-kube-api-access-mgb2f\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:43.850153 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.850134 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zqh\" (UniqueName: 
\"kubernetes.io/projected/f3f1d02f-e4b4-401b-b23c-adc6bc3f2616-kube-api-access-h7zqh\") pod \"network-check-source-8894fc9bd-bgmm8\" (UID: \"f3f1d02f-e4b4-401b-b23c-adc6bc3f2616\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8" Apr 17 17:24:43.850621 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.850520 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxlzs\" (UniqueName: \"kubernetes.io/projected/d61a14c7-8569-4165-aea3-ca07c214caef-kube-api-access-vxlzs\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:24:43.850736 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.850717 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxbnt\" (UniqueName: \"kubernetes.io/projected/b20915a3-9438-4b66-b1a3-83c753d5524b-kube-api-access-gxbnt\") pod \"service-ca-operator-d6fc45fc5-qqbrq\" (UID: \"b20915a3-9438-4b66-b1a3-83c753d5524b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 17:24:43.888761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.888730 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q" Apr 17 17:24:43.920873 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.920755 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mh4jk" Apr 17 17:24:43.936679 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.936651 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" Apr 17 17:24:43.941783 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.941070 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:43.944392 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.944346 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbwm\" (UniqueName: \"kubernetes.io/projected/cee37054-a7f8-4e3d-8733-e19de26444e7-kube-api-access-pwbwm\") pod \"network-check-target-kzg6t\" (UID: \"cee37054-a7f8-4e3d-8733-e19de26444e7\") " pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:43.951292 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.951266 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:24:43.967037 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.966554 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" Apr 17 17:24:43.973461 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:43.973061 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8" Apr 17 17:24:44.077864 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.077489 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q"] Apr 17 17:24:44.088778 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:44.088695 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19432bca_a912_4225_ab78_b273f482ec76.slice/crio-a89dc521b4d5171142d68b0282298a279b3eebfc33c1c91d1ad437b24e8bdfbc WatchSource:0}: Error finding container a89dc521b4d5171142d68b0282298a279b3eebfc33c1c91d1ad437b24e8bdfbc: Status 404 returned error can't find the container with id a89dc521b4d5171142d68b0282298a279b3eebfc33c1c91d1ad437b24e8bdfbc Apr 17 17:24:44.098894 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.096241 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mh4jk"] Apr 17 17:24:44.108197 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:44.108166 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf175c7b4_7e01_4c45_b098_27c87d4ba139.slice/crio-553c8a28e938d0089a8c2a0c3be9e39d3044dc11aa6b79fada45edc01abc178e WatchSource:0}: Error finding container 553c8a28e938d0089a8c2a0c3be9e39d3044dc11aa6b79fada45edc01abc178e: Status 404 returned error can't find the container with id 553c8a28e938d0089a8c2a0c3be9e39d3044dc11aa6b79fada45edc01abc178e Apr 17 17:24:44.128005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.127954 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq"] Apr 17 17:24:44.154092 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.154068 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-sb45j"] Apr 17 17:24:44.156466 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:44.156430 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48466d02_4d96_4b89_b43e_1adc61971469.slice/crio-40be3415dce80646b203f4c3f0633d381df40f2413b853fe203f8aae05dcb681 WatchSource:0}: Error finding container 40be3415dce80646b203f4c3f0633d381df40f2413b853fe203f8aae05dcb681: Status 404 returned error can't find the container with id 40be3415dce80646b203f4c3f0633d381df40f2413b853fe203f8aae05dcb681 Apr 17 17:24:44.170239 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.170212 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp"] Apr 17 17:24:44.173232 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:44.173209 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30189955_9e22_4280_bbeb_b99c4bea9d98.slice/crio-76d6fb376e02b730475d7a878242e3438285bf31fbcc1e63b2d29f796dd569a2 WatchSource:0}: Error finding container 76d6fb376e02b730475d7a878242e3438285bf31fbcc1e63b2d29f796dd569a2: Status 404 returned error can't find the container with id 76d6fb376e02b730475d7a878242e3438285bf31fbcc1e63b2d29f796dd569a2 Apr 17 17:24:44.181238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.181211 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8"] Apr 17 17:24:44.184069 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:44.184043 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3f1d02f_e4b4_401b_b23c_adc6bc3f2616.slice/crio-68a565acd55ffb0b56f90b2fc324d0b5714641842595dc8f47b916610c4e5a90 WatchSource:0}: Error finding 
container 68a565acd55ffb0b56f90b2fc324d0b5714641842595dc8f47b916610c4e5a90: Status 404 returned error can't find the container with id 68a565acd55ffb0b56f90b2fc324d0b5714641842595dc8f47b916610c4e5a90 Apr 17 17:24:44.243981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.243942 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" Apr 17 17:24:44.244626 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.244018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:24:44.244626 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.244144 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:24:44.244626 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.244225 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls podName:f65f4a7f-b841-425d-b6cf-02bd9f48fd69 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.244208989 +0000 UTC m=+34.576826613 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zl5fr" (UID: "f65f4a7f-b841-425d-b6cf-02bd9f48fd69") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:24:44.244626 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.244150 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:44.244626 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.244253 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-596b4b48f5-ptwxj: secret "image-registry-tls" not found Apr 17 17:24:44.244626 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.244293 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls podName:72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.244281344 +0000 UTC m=+34.576898975 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls") pod "image-registry-596b4b48f5-ptwxj" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0") : secret "image-registry-tls" not found Apr 17 17:24:44.345319 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.345219 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:24:44.345476 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345359 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 17:24:44.345476 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.345395 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:24:44.345476 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345429 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert podName:b697600b-a4a4-48b8-b42e-9965d51b283c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.345411341 +0000 UTC m=+34.678028977 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h9fr2" (UID: "b697600b-a4a4-48b8-b42e-9965d51b283c") : secret "networking-console-plugin-cert" not found Apr 17 17:24:44.345476 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345475 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:24:44.345644 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.345478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:44.345644 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345513 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls podName:d61a14c7-8569-4165-aea3-ca07c214caef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.345500739 +0000 UTC m=+34.678118364 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82d56" (UID: "d61a14c7-8569-4165-aea3-ca07c214caef") : secret "samples-operator-tls" not found Apr 17 17:24:44.345644 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345546 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:44.345644 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.345572 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:44.345644 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345578 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls podName:40520684-848d-46b0-8288-7c708075eda2 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.34556822 +0000 UTC m=+34.678185855 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls") pod "dns-default-zlw54" (UID: "40520684-848d-46b0-8288-7c708075eda2") : secret "dns-default-metrics-tls" not found Apr 17 17:24:44.345644 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345621 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:24:44.345644 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.345631 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:24:44.345903 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345653 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.345643778 +0000 UTC m=+34.678261402 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : secret "router-metrics-certs-default" not found Apr 17 17:24:44.345903 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345688 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:44.345903 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.345692 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:44.345903 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345716 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert podName:8cd66734-7425-4cd0-99cb-727b3059a1ed nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.345706683 +0000 UTC m=+34.678324328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert") pod "ingress-canary-r4jgw" (UID: "8cd66734-7425-4cd0-99cb-727b3059a1ed") : secret "canary-serving-cert" not found Apr 17 17:24:44.345903 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:44.345795 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:45.345773586 +0000 UTC m=+34.678391209 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : configmap references non-existent config key: service-ca.crt Apr 17 17:24:44.402144 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.402097 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mh4jk" event={"ID":"f175c7b4-7e01-4c45-b098-27c87d4ba139","Type":"ContainerStarted","Data":"553c8a28e938d0089a8c2a0c3be9e39d3044dc11aa6b79fada45edc01abc178e"} Apr 17 17:24:44.403192 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.403164 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" event={"ID":"30189955-9e22-4280-bbeb-b99c4bea9d98","Type":"ContainerStarted","Data":"76d6fb376e02b730475d7a878242e3438285bf31fbcc1e63b2d29f796dd569a2"} Apr 17 17:24:44.404193 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.404170 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q" event={"ID":"19432bca-a912-4225-ab78-b273f482ec76","Type":"ContainerStarted","Data":"a89dc521b4d5171142d68b0282298a279b3eebfc33c1c91d1ad437b24e8bdfbc"} Apr 17 17:24:44.405228 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.405200 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8" event={"ID":"f3f1d02f-e4b4-401b-b23c-adc6bc3f2616","Type":"ContainerStarted","Data":"68a565acd55ffb0b56f90b2fc324d0b5714641842595dc8f47b916610c4e5a90"} Apr 17 17:24:44.406324 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.406301 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" 
event={"ID":"b20915a3-9438-4b66-b1a3-83c753d5524b","Type":"ContainerStarted","Data":"230ce3a0996f95edd553fd6fba9c1f2dafad4f0c9d773fb63f8793f4d1493c2e"} Apr 17 17:24:44.407307 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:44.407286 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" event={"ID":"48466d02-4d96-4b89-b43e-1adc61971469","Type":"ContainerStarted","Data":"40be3415dce80646b203f4c3f0633d381df40f2413b853fe203f8aae05dcb681"} Apr 17 17:24:45.191199 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.189998 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9" Apr 17 17:24:45.191199 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.190801 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:45.193436 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.191772 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:24:45.193436 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.193115 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:24:45.196328 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.194014 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m76bp\"" Apr 17 17:24:45.199238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.197408 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:24:45.212381 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.210041 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6nwcp\"" Apr 17 17:24:45.232035 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.231621 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:24:45.256035 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.255712 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" Apr 17 17:24:45.256035 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.255782 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:24:45.256035 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.255887 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:24:45.256035 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.255942 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:45.256035 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.255957 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-596b4b48f5-ptwxj: secret "image-registry-tls" not found Apr 17 17:24:45.256035 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.255959 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls podName:f65f4a7f-b841-425d-b6cf-02bd9f48fd69 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:47.255939088 +0000 UTC m=+36.588556725 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zl5fr" (UID: "f65f4a7f-b841-425d-b6cf-02bd9f48fd69") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:24:45.256035 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.256016 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls podName:72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.256000128 +0000 UTC m=+36.588617759 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls") pod "image-registry-596b4b48f5-ptwxj" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0") : secret "image-registry-tls" not found Apr 17 17:24:45.358741 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.358693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:24:45.358883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.358789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:45.358940 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:24:45.358884 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:45.358940 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.358930 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:24:45.359098 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.358969 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:45.359098 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.359002 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:24:45.359208 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359135 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 17:24:45.359208 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359201 2565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert podName:b697600b-a4a4-48b8-b42e-9965d51b283c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.359182505 +0000 UTC m=+36.691800131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h9fr2" (UID: "b697600b-a4a4-48b8-b42e-9965d51b283c") : secret "networking-console-plugin-cert" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359608 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359661 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.359646064 +0000 UTC m=+36.692263690 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : secret "router-metrics-certs-default" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359719 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359749 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert podName:8cd66734-7425-4cd0-99cb-727b3059a1ed nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.359738808 +0000 UTC m=+36.692356439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert") pod "ingress-canary-r4jgw" (UID: "8cd66734-7425-4cd0-99cb-727b3059a1ed") : secret "canary-serving-cert" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359832 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359861 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.359825391 +0000 UTC m=+36.692443033 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : configmap references non-existent config key: service-ca.crt Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359885 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls podName:d61a14c7-8569-4165-aea3-ca07c214caef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.35987424 +0000 UTC m=+36.692491864 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82d56" (UID: "d61a14c7-8569-4165-aea3-ca07c214caef") : secret "samples-operator-tls" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359939 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:45.360013 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:45.359986 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls podName:40520684-848d-46b0-8288-7c708075eda2 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:47.359972253 +0000 UTC m=+36.692589893 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls") pod "dns-default-zlw54" (UID: "40520684-848d-46b0-8288-7c708075eda2") : secret "dns-default-metrics-tls" not found
Apr 17 17:24:45.423635 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:45.423601 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kzg6t"]
Apr 17 17:24:47.178361 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.178267 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9"
Apr 17 17:24:47.182654 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.182629 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b024619-6a7c-49fc-b69c-a21fab2b9f5c-original-pull-secret\") pod \"global-pull-secret-syncer-bf9w9\" (UID: \"7b024619-6a7c-49fc-b69c-a21fab2b9f5c\") " pod="kube-system/global-pull-secret-syncer-bf9w9"
Apr 17 17:24:47.278945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.278901 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:47.279108 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.278973 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:47.279108 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.279069 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:24:47.279108 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.279087 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:24:47.279108 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.279112 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-596b4b48f5-ptwxj: secret "image-registry-tls" not found
Apr 17 17:24:47.279287 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.279137 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls podName:f65f4a7f-b841-425d-b6cf-02bd9f48fd69 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.279117407 +0000 UTC m=+40.611735031 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zl5fr" (UID: "f65f4a7f-b841-425d-b6cf-02bd9f48fd69") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:24:47.279287 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.279167 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls podName:72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.279148961 +0000 UTC m=+40.611766588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls") pod "image-registry-596b4b48f5-ptwxj" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0") : secret "image-registry-tls" not found
Apr 17 17:24:47.323697 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.323329 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bf9w9"
Apr 17 17:24:47.379763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.379720 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:47.379763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.379776 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw"
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.379797 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.379826 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2"
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.379897 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56"
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.379919 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:47.379946 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54"
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.379923 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.379980 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.379990 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.37996859 +0000 UTC m=+40.712586231 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : configmap references non-existent config key: service-ca.crt
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.380015 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert podName:8cd66734-7425-4cd0-99cb-727b3059a1ed nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.380005629 +0000 UTC m=+40.712623253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert") pod "ingress-canary-r4jgw" (UID: "8cd66734-7425-4cd0-99cb-727b3059a1ed") : secret "canary-serving-cert" not found
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.380029 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.380022589 +0000 UTC m=+40.712640213 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : secret "router-metrics-certs-default" not found
Apr 17 17:24:47.380029 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.380033 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:24:47.380428 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.380043 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert podName:b697600b-a4a4-48b8-b42e-9965d51b283c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.380035944 +0000 UTC m=+40.712653569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h9fr2" (UID: "b697600b-a4a4-48b8-b42e-9965d51b283c") : secret "networking-console-plugin-cert" not found
Apr 17 17:24:47.380428 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.380056 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:24:47.380428 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.380063 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls podName:40520684-848d-46b0-8288-7c708075eda2 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.380052984 +0000 UTC m=+40.712670608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls") pod "dns-default-zlw54" (UID: "40520684-848d-46b0-8288-7c708075eda2") : secret "dns-default-metrics-tls" not found
Apr 17 17:24:47.380428 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:47.380092 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls podName:d61a14c7-8569-4165-aea3-ca07c214caef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.38007733 +0000 UTC m=+40.712694969 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82d56" (UID: "d61a14c7-8569-4165-aea3-ca07c214caef") : secret "samples-operator-tls" not found
Apr 17 17:24:48.206255 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:48.206212 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee37054_a7f8_4e3d_8733_e19de26444e7.slice/crio-e3ab5c867a746a19d96ce15e933088ab686ac1bd76304830994575d4c8256237 WatchSource:0}: Error finding container e3ab5c867a746a19d96ce15e933088ab686ac1bd76304830994575d4c8256237: Status 404 returned error can't find the container with id e3ab5c867a746a19d96ce15e933088ab686ac1bd76304830994575d4c8256237
Apr 17 17:24:48.418169 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:48.418130 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kzg6t" event={"ID":"cee37054-a7f8-4e3d-8733-e19de26444e7","Type":"ContainerStarted","Data":"e3ab5c867a746a19d96ce15e933088ab686ac1bd76304830994575d4c8256237"}
Apr 17 17:24:51.316755 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.316717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:51.317213 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.316799 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:51.317213 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.316933 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:24:51.317213 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.316948 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-596b4b48f5-ptwxj: secret "image-registry-tls" not found
Apr 17 17:24:51.317213 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.316998 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls podName:72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.316980821 +0000 UTC m=+48.649598447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls") pod "image-registry-596b4b48f5-ptwxj" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0") : secret "image-registry-tls" not found
Apr 17 17:24:51.317425 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.317356 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:24:51.317475 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.317424 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls podName:f65f4a7f-b841-425d-b6cf-02bd9f48fd69 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.317407671 +0000 UTC m=+48.650025306 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zl5fr" (UID: "f65f4a7f-b841-425d-b6cf-02bd9f48fd69") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.418087 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.418172 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw"
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.418205 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx"
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.418238 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2"
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418251 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418291 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.418307 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56"
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418325 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.418305884 +0000 UTC m=+48.750923511 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : secret "router-metrics-certs-default" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418369 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418371 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert podName:8cd66734-7425-4cd0-99cb-727b3059a1ed nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.418349102 +0000 UTC m=+48.750966731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert") pod "ingress-canary-r4jgw" (UID: "8cd66734-7425-4cd0-99cb-727b3059a1ed") : secret "canary-serving-cert" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418421 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert podName:b697600b-a4a4-48b8-b42e-9965d51b283c nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.418408676 +0000 UTC m=+48.751026300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h9fr2" (UID: "b697600b-a4a4-48b8-b42e-9965d51b283c") : secret "networking-console-plugin-cert" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418428 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:24:51.418433 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418437 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.418428686 +0000 UTC m=+48.751046312 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : configmap references non-existent config key: service-ca.crt
Apr 17 17:24:51.419212 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418469 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls podName:d61a14c7-8569-4165-aea3-ca07c214caef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.418456569 +0000 UTC m=+48.751074195 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82d56" (UID: "d61a14c7-8569-4165-aea3-ca07c214caef") : secret "samples-operator-tls" not found
Apr 17 17:24:51.419212 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:51.418500 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54"
Apr 17 17:24:51.419212 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418593 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:24:51.419212 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:51.418625 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls podName:40520684-848d-46b0-8288-7c708075eda2 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.418614423 +0000 UTC m=+48.751232071 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls") pod "dns-default-zlw54" (UID: "40520684-848d-46b0-8288-7c708075eda2") : secret "dns-default-metrics-tls" not found
Apr 17 17:24:53.454808 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:53.454746 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bf9w9"]
Apr 17 17:24:53.460061 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:24:53.460030 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b024619_6a7c_49fc_b69c_a21fab2b9f5c.slice/crio-35a13a860415fb40c4e61b8561555ae70482566ddfc9d042330dfdac78da5005 WatchSource:0}: Error finding container 35a13a860415fb40c4e61b8561555ae70482566ddfc9d042330dfdac78da5005: Status 404 returned error can't find the container with id 35a13a860415fb40c4e61b8561555ae70482566ddfc9d042330dfdac78da5005
Apr 17 17:24:54.433551 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.432890 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8" event={"ID":"f3f1d02f-e4b4-401b-b23c-adc6bc3f2616","Type":"ContainerStarted","Data":"a591a5cc70aacdb143c6ce104bc8a174ae6e570ffae6772c746662138af5fbf8"}
Apr 17 17:24:54.435251 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.435216 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" event={"ID":"b20915a3-9438-4b66-b1a3-83c753d5524b","Type":"ContainerStarted","Data":"40cd339ef8a37f399a50070729177724abcc600c6d34a6e45a6795314c768463"}
Apr 17 17:24:54.438180 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.438161 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/0.log"
Apr 17 17:24:54.438288 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.438199 2565 generic.go:358] "Generic (PLEG): container finished" podID="48466d02-4d96-4b89-b43e-1adc61971469" containerID="1f745da31b96b5cd4e2700d6793b9848b368495acb3eb58a36f709d0587a6e2b" exitCode=255
Apr 17 17:24:54.438491 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.438396 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" event={"ID":"48466d02-4d96-4b89-b43e-1adc61971469","Type":"ContainerDied","Data":"1f745da31b96b5cd4e2700d6793b9848b368495acb3eb58a36f709d0587a6e2b"}
Apr 17 17:24:54.438592 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.438528 2565 scope.go:117] "RemoveContainer" containerID="1f745da31b96b5cd4e2700d6793b9848b368495acb3eb58a36f709d0587a6e2b"
Apr 17 17:24:54.441871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.441481 2565 generic.go:358] "Generic (PLEG): container finished" podID="22fe4e50-13fd-4ae5-b9a6-1552184b400f" containerID="5bf19839159559ea0b082a732f1fe0f143e8c090aaeeb160cf418506b74d2cff" exitCode=0
Apr 17 17:24:54.441871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.441539 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerDied","Data":"5bf19839159559ea0b082a732f1fe0f143e8c090aaeeb160cf418506b74d2cff"}
Apr 17 17:24:54.446299 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.445803 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mh4jk" event={"ID":"f175c7b4-7e01-4c45-b098-27c87d4ba139","Type":"ContainerStarted","Data":"6fe59ef59a0e3e1aa8baf6ec8bd9f57c2f459c9b62bbb279974f1432c0228b3e"}
Apr 17 17:24:54.451259 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.451213 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-bgmm8" podStartSLOduration=20.288861511 podStartE2EDuration="29.451198698s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:24:44.186359943 +0000 UTC m=+33.518977566" lastFinishedPulling="2026-04-17 17:24:53.348697125 +0000 UTC m=+42.681314753" observedRunningTime="2026-04-17 17:24:54.450470895 +0000 UTC m=+43.783088536" watchObservedRunningTime="2026-04-17 17:24:54.451198698 +0000 UTC m=+43.783816346"
Apr 17 17:24:54.453922 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.453890 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" event={"ID":"30189955-9e22-4280-bbeb-b99c4bea9d98","Type":"ContainerStarted","Data":"c36cbfb3473e69e3298fd8f96dad1379c86131cdb73baf6b44d9c42551721689"}
Apr 17 17:24:54.456696 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.456625 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q" event={"ID":"19432bca-a912-4225-ab78-b273f482ec76","Type":"ContainerStarted","Data":"8dfa848a2a4d321177fed6234fc233869150018610d0ef8e0a930cb1cc4b8b7a"}
Apr 17 17:24:54.458759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.458691 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kzg6t" event={"ID":"cee37054-a7f8-4e3d-8733-e19de26444e7","Type":"ContainerStarted","Data":"98770c2efb467cd970a7c62c28cfb89f244a912501fad72e127a7f8ad02f4fd8"}
Apr 17 17:24:54.460078 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.460056 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bf9w9" event={"ID":"7b024619-6a7c-49fc-b69c-a21fab2b9f5c","Type":"ContainerStarted","Data":"35a13a860415fb40c4e61b8561555ae70482566ddfc9d042330dfdac78da5005"}
Apr 17 17:24:54.467481 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.467435 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-mh4jk" podStartSLOduration=20.24719285 podStartE2EDuration="29.467421795s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:24:44.113288465 +0000 UTC m=+33.445906112" lastFinishedPulling="2026-04-17 17:24:53.333517416 +0000 UTC m=+42.666135057" observedRunningTime="2026-04-17 17:24:54.466424753 +0000 UTC m=+43.799042401" watchObservedRunningTime="2026-04-17 17:24:54.467421795 +0000 UTC m=+43.800039441"
Apr 17 17:24:54.508070 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.507749 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kzg6t"
Apr 17 17:24:54.508436 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.508351 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" podStartSLOduration=20.310214881 podStartE2EDuration="29.508335086s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:24:44.135343085 +0000 UTC m=+33.467960723" lastFinishedPulling="2026-04-17 17:24:53.333463293 +0000 UTC m=+42.666080928" observedRunningTime="2026-04-17 17:24:54.487945456 +0000 UTC m=+43.820563117" watchObservedRunningTime="2026-04-17 17:24:54.508335086 +0000 UTC m=+43.840952731"
Apr 17 17:24:54.570239 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.568619 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ct24q" podStartSLOduration=20.330454381 podStartE2EDuration="29.568598465s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:24:44.093238261 +0000 UTC m=+33.425855899" lastFinishedPulling="2026-04-17 17:24:53.331382346 +0000 UTC m=+42.663999983" observedRunningTime="2026-04-17 17:24:54.567084203 +0000 UTC m=+43.899701850" watchObservedRunningTime="2026-04-17 17:24:54.568598465 +0000 UTC m=+43.901216111"
Apr 17 17:24:54.628320 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.622511 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" podStartSLOduration=20.464495095 podStartE2EDuration="29.62249285s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:24:44.175464541 +0000 UTC m=+33.508082165" lastFinishedPulling="2026-04-17 17:24:53.333462295 +0000 UTC m=+42.666079920" observedRunningTime="2026-04-17 17:24:54.598797771 +0000 UTC m=+43.931415419" watchObservedRunningTime="2026-04-17 17:24:54.62249285 +0000 UTC m=+43.955110498"
Apr 17 17:24:54.628320 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:54.623132 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kzg6t" podStartSLOduration=38.271276745 podStartE2EDuration="43.623119537s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:48.208305888 +0000 UTC m=+37.540923512" lastFinishedPulling="2026-04-17 17:24:53.560148666 +0000 UTC m=+42.892766304" observedRunningTime="2026-04-17 17:24:54.621510677 +0000 UTC m=+43.954128323" watchObservedRunningTime="2026-04-17 17:24:54.623119537 +0000 UTC m=+43.955737183"
Apr 17 17:24:55.468567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.468516 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:24:55.469188 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.468967 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/0.log"
Apr 17 17:24:55.469188 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.469000 2565 generic.go:358] "Generic (PLEG): container finished" podID="48466d02-4d96-4b89-b43e-1adc61971469" containerID="363f5b57bdba8356bb4ca2d33f825a6559facfc9f17f2fd0dd0c1b83c36f4de6" exitCode=255
Apr 17 17:24:55.469188 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.469080 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" event={"ID":"48466d02-4d96-4b89-b43e-1adc61971469","Type":"ContainerDied","Data":"363f5b57bdba8356bb4ca2d33f825a6559facfc9f17f2fd0dd0c1b83c36f4de6"}
Apr 17 17:24:55.469447 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.469222 2565 scope.go:117] "RemoveContainer" containerID="1f745da31b96b5cd4e2700d6793b9848b368495acb3eb58a36f709d0587a6e2b"
Apr 17 17:24:55.469447 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.469386 2565 scope.go:117] "RemoveContainer" containerID="363f5b57bdba8356bb4ca2d33f825a6559facfc9f17f2fd0dd0c1b83c36f4de6"
Apr 17 17:24:55.469630 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:55.469564 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sb45j_openshift-console-operator(48466d02-4d96-4b89-b43e-1adc61971469)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" podUID="48466d02-4d96-4b89-b43e-1adc61971469"
Apr 17 17:24:55.473682 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.473652 2565 generic.go:358] "Generic (PLEG): container finished" podID="22fe4e50-13fd-4ae5-b9a6-1552184b400f" containerID="cbbe1bc46da9e16ce2d30db0b4202c822db09db716159fa5d1a6e73ad4ce4427" exitCode=0
Apr 17 17:24:55.473820 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:55.473797 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerDied","Data":"cbbe1bc46da9e16ce2d30db0b4202c822db09db716159fa5d1a6e73ad4ce4427"}
Apr 17 17:24:56.478735 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:56.478709 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:24:56.479293 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:56.479147 2565 scope.go:117] "RemoveContainer" containerID="363f5b57bdba8356bb4ca2d33f825a6559facfc9f17f2fd0dd0c1b83c36f4de6"
Apr 17 17:24:56.479406 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:56.479384 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sb45j_openshift-console-operator(48466d02-4d96-4b89-b43e-1adc61971469)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" podUID="48466d02-4d96-4b89-b43e-1adc61971469"
Apr 17 17:24:56.482233 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:56.482205 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" event={"ID":"22fe4e50-13fd-4ae5-b9a6-1552184b400f","Type":"ContainerStarted","Data":"155675656edab582741462376762439828d86a4547ac0b204946a36fe70a6009"}
Apr 17 17:24:56.524337 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:56.524289 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2hc4g" podStartSLOduration=4.630623033 podStartE2EDuration="45.524275378s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:24:12.437601814 +0000 UTC m=+1.770219438" lastFinishedPulling="2026-04-17 17:24:53.331254143 +0000 UTC m=+42.663871783" observedRunningTime="2026-04-17 17:24:56.523659073 +0000 UTC m=+45.856276720" watchObservedRunningTime="2026-04-17 17:24:56.524275378 +0000 UTC m=+45.856893025"
Apr 17 17:24:57.512407 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:57.512381 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f54jc_1ac6ac62-b297-4aad-a58f-12c981987869/dns-node-resolver/0.log"
Apr 17 17:24:58.489550 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:58.489511 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bf9w9" event={"ID":"7b024619-6a7c-49fc-b69c-a21fab2b9f5c","Type":"ContainerStarted","Data":"fbc0e420057c08e3eeacb3bf9031198676036c4c30df48f5eeae8c932c95bc28"}
Apr 17 17:24:58.505816 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:58.505768 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bf9w9" podStartSLOduration=39.048812746 podStartE2EDuration="43.505755039s" podCreationTimestamp="2026-04-17 17:24:15 +0000 UTC" firstStartedPulling="2026-04-17 17:24:53.462365308 +0000 UTC m=+42.794982931" lastFinishedPulling="2026-04-17 17:24:57.919307584 +0000 UTC m=+47.251925224" observedRunningTime="2026-04-17 17:24:58.504808101 +0000 UTC m=+47.837425750" watchObservedRunningTime="2026-04-17 17:24:58.505755039 +0000 UTC m=+47.838372675"
Apr 17 17:24:58.512069 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:58.512050 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rkk2d_a04f786b-b1fc-4078-9d22-1b263b785992/node-ca/0.log"
Apr 17 17:24:59.404216 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.404170 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"
Apr 17 17:24:59.404634 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.404236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:24:59.404634 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.404310 2565 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:24:59.404634 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.404354 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:24:59.404634 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.404371 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-596b4b48f5-ptwxj: secret "image-registry-tls" not found
Apr 17 17:24:59.404634 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.404377 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls podName:f65f4a7f-b841-425d-b6cf-02bd9f48fd69 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.404361283 +0000 UTC m=+64.736978906 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zl5fr" (UID: "f65f4a7f-b841-425d-b6cf-02bd9f48fd69") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:24:59.404634 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.404433 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls podName:72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.404417644 +0000 UTC m=+64.737035267 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls") pod "image-registry-596b4b48f5-ptwxj" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0") : secret "image-registry-tls" not found Apr 17 17:24:59.504670 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.504639 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:24:59.504866 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.504692 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:59.504866 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.504728 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:24:59.504866 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.504754 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:24:59.504866 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504780 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:59.504866 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504855 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls podName:40520684-848d-46b0-8288-7c708075eda2 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.50482024 +0000 UTC m=+64.837437869 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls") pod "dns-default-zlw54" (UID: "40520684-848d-46b0-8288-7c708075eda2") : secret "dns-default-metrics-tls" not found Apr 17 17:24:59.504866 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504854 2565 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504873 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504894 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.504883011 +0000 UTC m=+64.837500646 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : secret "router-metrics-certs-default" not found Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504908 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle podName:fba1d80e-239c-4b35-afc0-5052c340a9ee nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.504902487 +0000 UTC m=+64.837520110 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle") pod "router-default-564dfc4b58-br5kx" (UID: "fba1d80e-239c-4b35-afc0-5052c340a9ee") : configmap references non-existent config key: service-ca.crt Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504921 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504929 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert podName:8cd66734-7425-4cd0-99cb-727b3059a1ed nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.504913341 +0000 UTC m=+64.837530965 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert") pod "ingress-canary-r4jgw" (UID: "8cd66734-7425-4cd0-99cb-727b3059a1ed") : secret "canary-serving-cert" not found Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.504879 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.504955 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert podName:b697600b-a4a4-48b8-b42e-9965d51b283c nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:15.50494637 +0000 UTC m=+64.837563998 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h9fr2" (UID: "b697600b-a4a4-48b8-b42e-9965d51b283c") : secret "networking-console-plugin-cert" not found Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:24:59.504995 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.505052 2565 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:24:59.505109 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:24:59.505074 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls podName:d61a14c7-8569-4165-aea3-ca07c214caef nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.505066756 +0000 UTC m=+64.837684379 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82d56" (UID: "d61a14c7-8569-4165-aea3-ca07c214caef") : secret "samples-operator-tls" not found Apr 17 17:25:03.952476 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:03.952443 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:25:03.952963 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:03.952529 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:25:03.952963 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:03.952811 2565 scope.go:117] "RemoveContainer" containerID="363f5b57bdba8356bb4ca2d33f825a6559facfc9f17f2fd0dd0c1b83c36f4de6" Apr 17 17:25:03.953032 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:25:03.952987 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sb45j_openshift-console-operator(48466d02-4d96-4b89-b43e-1adc61971469)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" podUID="48466d02-4d96-4b89-b43e-1adc61971469" Apr 17 17:25:04.505954 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:04.505926 2565 scope.go:117] "RemoveContainer" containerID="363f5b57bdba8356bb4ca2d33f825a6559facfc9f17f2fd0dd0c1b83c36f4de6" Apr 17 17:25:04.506127 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:25:04.506111 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-sb45j_openshift-console-operator(48466d02-4d96-4b89-b43e-1adc61971469)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" podUID="48466d02-4d96-4b89-b43e-1adc61971469" Apr 17 17:25:08.403086 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:08.403059 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4pk44" Apr 17 17:25:15.440150 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.440107 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" Apr 17 17:25:15.440612 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.440172 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:25:15.442826 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.442798 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"image-registry-596b4b48f5-ptwxj\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") " pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:25:15.442945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.442903 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f65f4a7f-b841-425d-b6cf-02bd9f48fd69-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zl5fr\" (UID: \"f65f4a7f-b841-425d-b6cf-02bd9f48fd69\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" Apr 17 17:25:15.541564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.541528 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:25:15.541564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.541572 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:25:15.541823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.541617 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:15.541823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.541648 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:25:15.541823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.541670 
2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:15.541823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.541701 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:25:15.542451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.542423 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba1d80e-239c-4b35-afc0-5052c340a9ee-service-ca-bundle\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:15.544310 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.544279 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b697600b-a4a4-48b8-b42e-9965d51b283c-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h9fr2\" (UID: \"b697600b-a4a4-48b8-b42e-9965d51b283c\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:25:15.544479 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.544463 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d61a14c7-8569-4165-aea3-ca07c214caef-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-82d56\" (UID: \"d61a14c7-8569-4165-aea3-ca07c214caef\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:25:15.544542 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.544482 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba1d80e-239c-4b35-afc0-5052c340a9ee-metrics-certs\") pod \"router-default-564dfc4b58-br5kx\" (UID: \"fba1d80e-239c-4b35-afc0-5052c340a9ee\") " pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:15.544762 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.544740 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40520684-848d-46b0-8288-7c708075eda2-metrics-tls\") pod \"dns-default-zlw54\" (UID: \"40520684-848d-46b0-8288-7c708075eda2\") " pod="openshift-dns/dns-default-zlw54" Apr 17 17:25:15.544819 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.544805 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8cd66734-7425-4cd0-99cb-727b3059a1ed-cert\") pod \"ingress-canary-r4jgw\" (UID: \"8cd66734-7425-4cd0-99cb-727b3059a1ed\") " pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:25:15.704097 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.704020 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vv5sf\"" Apr 17 17:25:15.712234 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.712209 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:25:15.714624 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.714606 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hx8hn\"" Apr 17 17:25:15.722686 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.722668 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" Apr 17 17:25:15.732102 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.732084 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-brgkp\"" Apr 17 17:25:15.740081 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.740055 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" Apr 17 17:25:15.746353 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.746334 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-pv52z\"" Apr 17 17:25:15.754809 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.754779 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:15.762626 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.762415 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dxh7z\"" Apr 17 17:25:15.770749 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.770598 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" Apr 17 17:25:15.786915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.786362 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cl7ph\"" Apr 17 17:25:15.802876 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.791126 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rb846\"" Apr 17 17:25:15.802876 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.791503 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zlw54" Apr 17 17:25:15.802876 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.800196 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r4jgw" Apr 17 17:25:15.846732 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.846276 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:25:15.849298 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.849084 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:15.866407 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.866333 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/053402a3-0f05-4423-9697-95ba118cec9c-metrics-certs\") pod \"network-metrics-daemon-clsml\" (UID: \"053402a3-0f05-4423-9697-95ba118cec9c\") " pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:25:15.897104 ip-10-0-143-59 kubenswrapper[2565]: 
I0417 17:25:15.897054 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-596b4b48f5-ptwxj"] Apr 17 17:25:15.908399 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:15.908348 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72bd8fbf_5bf3_44ea_a9b7_8d6863a2a0f0.slice/crio-b1687ded54391324a6ac12a3025290f0f48ee4da3ee1536584bacc65b89b7fe7 WatchSource:0}: Error finding container b1687ded54391324a6ac12a3025290f0f48ee4da3ee1536584bacc65b89b7fe7: Status 404 returned error can't find the container with id b1687ded54391324a6ac12a3025290f0f48ee4da3ee1536584bacc65b89b7fe7 Apr 17 17:25:15.914082 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.910821 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr"] Apr 17 17:25:15.917885 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:15.917571 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65f4a7f_b841_425d_b6cf_02bd9f48fd69.slice/crio-ba53b6bdf1eb7a6a61271d2c1b6faa199411eab1233e18e346f863936c0894f0 WatchSource:0}: Error finding container ba53b6bdf1eb7a6a61271d2c1b6faa199411eab1233e18e346f863936c0894f0: Status 404 returned error can't find the container with id ba53b6bdf1eb7a6a61271d2c1b6faa199411eab1233e18e346f863936c0894f0 Apr 17 17:25:15.958735 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.957975 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56"] Apr 17 17:25:15.996672 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:15.996527 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-564dfc4b58-br5kx"] Apr 17 17:25:15.999653 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:15.999599 2565 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba1d80e_239c_4b35_afc0_5052c340a9ee.slice/crio-5943d15509d683f62b0b739062dfb2308622cdaf51428787281647ae988f304c WatchSource:0}: Error finding container 5943d15509d683f62b0b739062dfb2308622cdaf51428787281647ae988f304c: Status 404 returned error can't find the container with id 5943d15509d683f62b0b739062dfb2308622cdaf51428787281647ae988f304c Apr 17 17:25:16.019245 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.017909 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2"] Apr 17 17:25:16.023476 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:16.023450 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb697600b_a4a4_48b8_b42e_9965d51b283c.slice/crio-a7f757f80b708d533f005fc82eaa64a0aa9fb731c53942e8515fdf4c2a378855 WatchSource:0}: Error finding container a7f757f80b708d533f005fc82eaa64a0aa9fb731c53942e8515fdf4c2a378855: Status 404 returned error can't find the container with id a7f757f80b708d533f005fc82eaa64a0aa9fb731c53942e8515fdf4c2a378855 Apr 17 17:25:16.029677 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.029655 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r4jgw"] Apr 17 17:25:16.033129 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:16.033108 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd66734_7425_4cd0_99cb_727b3059a1ed.slice/crio-9e0e1139eb3b1a0fe2309741938f37a127be0d4cabd9af42463c34823da725b0 WatchSource:0}: Error finding container 9e0e1139eb3b1a0fe2309741938f37a127be0d4cabd9af42463c34823da725b0: Status 404 returned error can't find the container with id 9e0e1139eb3b1a0fe2309741938f37a127be0d4cabd9af42463c34823da725b0 Apr 17 17:25:16.048783 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:25:16.048758 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zlw54"] Apr 17 17:25:16.053042 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:16.053009 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40520684_848d_46b0_8288_7c708075eda2.slice/crio-82804ff1a1c62ebe9ee87f51ff572d2313530fbbdfe3b9240e1cccb25ec85541 WatchSource:0}: Error finding container 82804ff1a1c62ebe9ee87f51ff572d2313530fbbdfe3b9240e1cccb25ec85541: Status 404 returned error can't find the container with id 82804ff1a1c62ebe9ee87f51ff572d2313530fbbdfe3b9240e1cccb25ec85541 Apr 17 17:25:16.150721 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.150687 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6nwcp\"" Apr 17 17:25:16.158714 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.158694 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-clsml" Apr 17 17:25:16.281184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.281154 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-clsml"] Apr 17 17:25:16.284323 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:16.284290 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053402a3_0f05_4423_9697_95ba118cec9c.slice/crio-75ded454d6814aa8c8adc22b67f6b37518a6eff8a445446ea79568746b075505 WatchSource:0}: Error finding container 75ded454d6814aa8c8adc22b67f6b37518a6eff8a445446ea79568746b075505: Status 404 returned error can't find the container with id 75ded454d6814aa8c8adc22b67f6b37518a6eff8a445446ea79568746b075505 Apr 17 17:25:16.546415 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.546279 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" event={"ID":"f65f4a7f-b841-425d-b6cf-02bd9f48fd69","Type":"ContainerStarted","Data":"ba53b6bdf1eb7a6a61271d2c1b6faa199411eab1233e18e346f863936c0894f0"} Apr 17 17:25:16.549311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.549281 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r4jgw" event={"ID":"8cd66734-7425-4cd0-99cb-727b3059a1ed","Type":"ContainerStarted","Data":"9e0e1139eb3b1a0fe2309741938f37a127be0d4cabd9af42463c34823da725b0"} Apr 17 17:25:16.551553 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.551271 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" event={"ID":"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0","Type":"ContainerStarted","Data":"56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7"} Apr 17 17:25:16.551553 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.551305 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" event={"ID":"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0","Type":"ContainerStarted","Data":"b1687ded54391324a6ac12a3025290f0f48ee4da3ee1536584bacc65b89b7fe7"} Apr 17 17:25:16.552256 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.552227 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" Apr 17 17:25:16.553343 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.553320 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-clsml" event={"ID":"053402a3-0f05-4423-9697-95ba118cec9c","Type":"ContainerStarted","Data":"75ded454d6814aa8c8adc22b67f6b37518a6eff8a445446ea79568746b075505"} Apr 17 17:25:16.555180 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.555147 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" event={"ID":"b697600b-a4a4-48b8-b42e-9965d51b283c","Type":"ContainerStarted","Data":"a7f757f80b708d533f005fc82eaa64a0aa9fb731c53942e8515fdf4c2a378855"} Apr 17 17:25:16.558058 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.558033 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-564dfc4b58-br5kx" event={"ID":"fba1d80e-239c-4b35-afc0-5052c340a9ee","Type":"ContainerStarted","Data":"739a6d87c812d258282e6e3251d5264d50974c27b56110d604244f33a9f2ae37"} Apr 17 17:25:16.558196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.558065 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-564dfc4b58-br5kx" event={"ID":"fba1d80e-239c-4b35-afc0-5052c340a9ee","Type":"ContainerStarted","Data":"5943d15509d683f62b0b739062dfb2308622cdaf51428787281647ae988f304c"} Apr 17 17:25:16.560419 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.560379 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" event={"ID":"d61a14c7-8569-4165-aea3-ca07c214caef","Type":"ContainerStarted","Data":"ae3a18b145198b799377e04e60625af3b3908e662806355c4581551bc0bf2f04"} Apr 17 17:25:16.563030 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.563006 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zlw54" event={"ID":"40520684-848d-46b0-8288-7c708075eda2","Type":"ContainerStarted","Data":"82804ff1a1c62ebe9ee87f51ff572d2313530fbbdfe3b9240e1cccb25ec85541"} Apr 17 17:25:16.584887 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.584121 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" podStartSLOduration=65.584105263 podStartE2EDuration="1m5.584105263s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:16.583258281 +0000 UTC m=+65.915875928" watchObservedRunningTime="2026-04-17 17:25:16.584105263 +0000 UTC m=+65.916722910" Apr 17 17:25:16.615831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.614590 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-564dfc4b58-br5kx" podStartSLOduration=51.61457057 podStartE2EDuration="51.61457057s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:16.614559709 +0000 UTC m=+65.947177355" watchObservedRunningTime="2026-04-17 17:25:16.61457057 +0000 UTC m=+65.947188214" Apr 17 17:25:16.755604 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.755568 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:16.758858 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:16.758815 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:17.193710 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:17.193379 2565 scope.go:117] "RemoveContainer" containerID="363f5b57bdba8356bb4ca2d33f825a6559facfc9f17f2fd0dd0c1b83c36f4de6" Apr 17 17:25:17.567238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:17.567207 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:17.568608 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:17.568575 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-564dfc4b58-br5kx" Apr 17 17:25:20.576221 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.576187 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" event={"ID":"f65f4a7f-b841-425d-b6cf-02bd9f48fd69","Type":"ContainerStarted","Data":"ee9a1685f30d801cb68d75f6a5333c17c69e911a2b001786fdbac7d297a4a178"} Apr 17 17:25:20.577987 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.577960 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log" Apr 17 17:25:20.578117 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.578047 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" event={"ID":"48466d02-4d96-4b89-b43e-1adc61971469","Type":"ContainerStarted","Data":"577414dcfe7a6f6dd3d77491966be1e08c20b95e00a955577dece1d69a382f50"} Apr 17 17:25:20.578374 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.578352 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:25:20.579679 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.579646 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r4jgw" event={"ID":"8cd66734-7425-4cd0-99cb-727b3059a1ed","Type":"ContainerStarted","Data":"cca77b70fd929af378b1040f9bbe2ffb235d1f9aeef78269abe8a2a061d73c51"} Apr 17 17:25:20.581451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.581429 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-clsml" event={"ID":"053402a3-0f05-4423-9697-95ba118cec9c","Type":"ContainerStarted","Data":"a3a47e60525a862f1afa784b0f31c394f667633ad642dfa785b45e58d9925e94"} Apr 17 17:25:20.581451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.581453 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-clsml" event={"ID":"053402a3-0f05-4423-9697-95ba118cec9c","Type":"ContainerStarted","Data":"081a39e85fba053545157ac98f2d01097dc96d77367a024e8d120a0bbb960095"} Apr 17 17:25:20.582923 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.582901 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" event={"ID":"b697600b-a4a4-48b8-b42e-9965d51b283c","Type":"ContainerStarted","Data":"c2990dab4a7575d0032a252221d913de006b81e5c6ed6ab4fc0c405ad64966db"} Apr 17 17:25:20.584569 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.584547 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" event={"ID":"d61a14c7-8569-4165-aea3-ca07c214caef","Type":"ContainerStarted","Data":"03c7ebccf86e49392d75ce7d0fe4560988f8f2f9b07e9343fed6eddfc0caa298"} Apr 17 17:25:20.584666 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.584573 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" event={"ID":"d61a14c7-8569-4165-aea3-ca07c214caef","Type":"ContainerStarted","Data":"9a13f71791a7f08f9dcc193450d6a3255c33d5c1a6d343978832121ea9d35da4"} Apr 17 17:25:20.586038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.586018 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zlw54" event={"ID":"40520684-848d-46b0-8288-7c708075eda2","Type":"ContainerStarted","Data":"9d087d59ffc33bb012a23db242d574db620d8edd77ecf2a42f528085aae63bf8"} Apr 17 17:25:20.586127 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.586043 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zlw54" event={"ID":"40520684-848d-46b0-8288-7c708075eda2","Type":"ContainerStarted","Data":"1df15d3a8a6a455fd81be746f692c1297402c2d27196b4f2e80e9f958dfa85d0"} Apr 17 17:25:20.586238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.586225 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zlw54" Apr 17 17:25:20.597071 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.597026 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zl5fr" podStartSLOduration=51.627210226 podStartE2EDuration="55.597012093s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:25:15.930862937 +0000 UTC m=+65.263480580" lastFinishedPulling="2026-04-17 17:25:19.900664819 +0000 UTC m=+69.233282447" observedRunningTime="2026-04-17 17:25:20.596541995 +0000 UTC m=+69.929159643" watchObservedRunningTime="2026-04-17 17:25:20.597012093 +0000 UTC m=+69.929629739" Apr 17 17:25:20.615469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.615408 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" 
podStartSLOduration=46.442520686 podStartE2EDuration="55.615391797s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:24:44.158580383 +0000 UTC m=+33.491198020" lastFinishedPulling="2026-04-17 17:24:53.331451489 +0000 UTC m=+42.664069131" observedRunningTime="2026-04-17 17:25:20.614246588 +0000 UTC m=+69.946864248" watchObservedRunningTime="2026-04-17 17:25:20.615391797 +0000 UTC m=+69.948009444" Apr 17 17:25:20.634541 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.634498 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zlw54" podStartSLOduration=33.788562607 podStartE2EDuration="37.634481276s" podCreationTimestamp="2026-04-17 17:24:43 +0000 UTC" firstStartedPulling="2026-04-17 17:25:16.055261983 +0000 UTC m=+65.387879607" lastFinishedPulling="2026-04-17 17:25:19.901180647 +0000 UTC m=+69.233798276" observedRunningTime="2026-04-17 17:25:20.633591645 +0000 UTC m=+69.966209291" watchObservedRunningTime="2026-04-17 17:25:20.634481276 +0000 UTC m=+69.967098918" Apr 17 17:25:20.653101 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.653052 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82d56" podStartSLOduration=51.7578949 podStartE2EDuration="55.653039537s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:25:16.006034403 +0000 UTC m=+65.338652027" lastFinishedPulling="2026-04-17 17:25:19.901179031 +0000 UTC m=+69.233796664" observedRunningTime="2026-04-17 17:25:20.652751487 +0000 UTC m=+69.985369133" watchObservedRunningTime="2026-04-17 17:25:20.653039537 +0000 UTC m=+69.985657182" Apr 17 17:25:20.690634 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.690580 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h9fr2" 
podStartSLOduration=51.815462889 podStartE2EDuration="55.690560626s" podCreationTimestamp="2026-04-17 17:24:25 +0000 UTC" firstStartedPulling="2026-04-17 17:25:16.025483471 +0000 UTC m=+65.358101095" lastFinishedPulling="2026-04-17 17:25:19.900581194 +0000 UTC m=+69.233198832" observedRunningTime="2026-04-17 17:25:20.689277714 +0000 UTC m=+70.021895360" watchObservedRunningTime="2026-04-17 17:25:20.690560626 +0000 UTC m=+70.023178272" Apr 17 17:25:20.730946 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.730886 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r4jgw" podStartSLOduration=33.859105185 podStartE2EDuration="37.730868488s" podCreationTimestamp="2026-04-17 17:24:43 +0000 UTC" firstStartedPulling="2026-04-17 17:25:16.03510917 +0000 UTC m=+65.367726798" lastFinishedPulling="2026-04-17 17:25:19.906872464 +0000 UTC m=+69.239490101" observedRunningTime="2026-04-17 17:25:20.728635578 +0000 UTC m=+70.061253223" watchObservedRunningTime="2026-04-17 17:25:20.730868488 +0000 UTC m=+70.063486136" Apr 17 17:25:20.798041 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.797987 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-clsml" podStartSLOduration=66.183038539 podStartE2EDuration="1m9.797974075s" podCreationTimestamp="2026-04-17 17:24:11 +0000 UTC" firstStartedPulling="2026-04-17 17:25:16.286249487 +0000 UTC m=+65.618867112" lastFinishedPulling="2026-04-17 17:25:19.901185011 +0000 UTC m=+69.233802648" observedRunningTime="2026-04-17 17:25:20.783223782 +0000 UTC m=+70.115841430" watchObservedRunningTime="2026-04-17 17:25:20.797974075 +0000 UTC m=+70.130591719" Apr 17 17:25:20.798532 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.798516 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv"] Apr 17 17:25:20.825046 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:25:20.825023 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qhbg6"] Apr 17 17:25:20.825183 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.825161 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" Apr 17 17:25:20.827781 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.827720 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-564kw\"" Apr 17 17:25:20.827963 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.827939 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 17:25:20.843572 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.843548 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv"] Apr 17 17:25:20.843572 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.843574 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qhbg6"] Apr 17 17:25:20.843754 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.843696 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:20.849400 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.849332 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:25:20.849519 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.849486 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:25:20.849574 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.849526 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-42cdb\"" Apr 17 17:25:20.919014 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.918980 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-sb45j" Apr 17 17:25:20.990420 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.990382 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:20.990654 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.990635 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0e9e4663-293c-4ae6-b4a9-c7dd6e747e96-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-74bnv\" (UID: \"0e9e4663-293c-4ae6-b4a9-c7dd6e747e96\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" Apr 17 17:25:20.990820 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:25:20.990791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-crio-socket\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:20.990967 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.990824 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:20.990967 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.990886 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qms74\" (UniqueName: \"kubernetes.io/projected/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-kube-api-access-qms74\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:20.991090 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:20.990962 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-data-volume\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.091461 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.091338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qms74\" (UniqueName: 
\"kubernetes.io/projected/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-kube-api-access-qms74\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.091461 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.091407 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-data-volume\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.091688 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.091479 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.091688 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.091509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0e9e4663-293c-4ae6-b4a9-c7dd6e747e96-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-74bnv\" (UID: \"0e9e4663-293c-4ae6-b4a9-c7dd6e747e96\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" Apr 17 17:25:21.091688 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.091570 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-crio-socket\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" 
Apr 17 17:25:21.091688 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.091594 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.091915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.091894 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-crio-socket\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.092223 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.092175 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-data-volume\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.092291 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.092268 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.094189 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.094168 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.094346 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.094327 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/0e9e4663-293c-4ae6-b4a9-c7dd6e747e96-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-74bnv\" (UID: \"0e9e4663-293c-4ae6-b4a9-c7dd6e747e96\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" Apr 17 17:25:21.123569 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.123538 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qms74\" (UniqueName: \"kubernetes.io/projected/e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523-kube-api-access-qms74\") pod \"insights-runtime-extractor-qhbg6\" (UID: \"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523\") " pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.136494 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.136446 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" Apr 17 17:25:21.154830 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.154800 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qhbg6" Apr 17 17:25:21.203811 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.202944 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-lnv76"] Apr 17 17:25:21.241390 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.238117 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lnv76"] Apr 17 17:25:21.241390 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.238288 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lnv76" Apr 17 17:25:21.247339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.243257 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-vmzbn\"" Apr 17 17:25:21.247339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.243569 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:25:21.247339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.243767 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:25:21.299772 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.299747 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv"] Apr 17 17:25:21.302378 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:21.302334 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9e4663_293c_4ae6_b4a9_c7dd6e747e96.slice/crio-0df6ab02c53f43e8140b708560bb2741d3c0d36bfc272bee25eac7c940ea7f6e WatchSource:0}: Error finding container 0df6ab02c53f43e8140b708560bb2741d3c0d36bfc272bee25eac7c940ea7f6e: Status 404 returned error can't find the container with id 0df6ab02c53f43e8140b708560bb2741d3c0d36bfc272bee25eac7c940ea7f6e Apr 17 17:25:21.328315 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.328288 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qhbg6"] Apr 17 17:25:21.331520 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:21.331484 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04dca9e_5b67_4a1c_9c5a_5a5cd5fad523.slice/crio-1bcf06542c683aef013fe916320bd7158dabe6a9279d3b4d4c3be8fe6b7c6244 WatchSource:0}: Error finding container 1bcf06542c683aef013fe916320bd7158dabe6a9279d3b4d4c3be8fe6b7c6244: Status 404 returned error can't find the container with id 1bcf06542c683aef013fe916320bd7158dabe6a9279d3b4d4c3be8fe6b7c6244 Apr 17 17:25:21.394829 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.394803 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77cx\" (UniqueName: \"kubernetes.io/projected/66303fe9-2e35-470e-8d83-6c31c7481520-kube-api-access-k77cx\") pod \"downloads-6bcc868b7-lnv76\" (UID: \"66303fe9-2e35-470e-8d83-6c31c7481520\") " pod="openshift-console/downloads-6bcc868b7-lnv76" Apr 17 17:25:21.495258 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.495227 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k77cx\" (UniqueName: \"kubernetes.io/projected/66303fe9-2e35-470e-8d83-6c31c7481520-kube-api-access-k77cx\") pod \"downloads-6bcc868b7-lnv76\" (UID: \"66303fe9-2e35-470e-8d83-6c31c7481520\") " pod="openshift-console/downloads-6bcc868b7-lnv76" Apr 17 17:25:21.505401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.505378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77cx\" (UniqueName: \"kubernetes.io/projected/66303fe9-2e35-470e-8d83-6c31c7481520-kube-api-access-k77cx\") pod \"downloads-6bcc868b7-lnv76\" (UID: \"66303fe9-2e35-470e-8d83-6c31c7481520\") " pod="openshift-console/downloads-6bcc868b7-lnv76" Apr 17 17:25:21.555485 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.555456 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-lnv76" Apr 17 17:25:21.589790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.589760 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" event={"ID":"0e9e4663-293c-4ae6-b4a9-c7dd6e747e96","Type":"ContainerStarted","Data":"0df6ab02c53f43e8140b708560bb2741d3c0d36bfc272bee25eac7c940ea7f6e"} Apr 17 17:25:21.591323 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.591294 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qhbg6" event={"ID":"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523","Type":"ContainerStarted","Data":"cb22b76d01609247d10b1b7225a89d5fc070522b2751e8cc3527701464efcf7f"} Apr 17 17:25:21.591323 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.591330 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qhbg6" event={"ID":"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523","Type":"ContainerStarted","Data":"1bcf06542c683aef013fe916320bd7158dabe6a9279d3b4d4c3be8fe6b7c6244"} Apr 17 17:25:21.690953 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:21.690920 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-lnv76"] Apr 17 17:25:21.694828 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:21.694801 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66303fe9_2e35_470e_8d83_6c31c7481520.slice/crio-4ddf28644dc54b9c7008ca137858eca285cb4cacd39220cfc865b42d77f5bd05 WatchSource:0}: Error finding container 4ddf28644dc54b9c7008ca137858eca285cb4cacd39220cfc865b42d77f5bd05: Status 404 returned error can't find the container with id 4ddf28644dc54b9c7008ca137858eca285cb4cacd39220cfc865b42d77f5bd05 Apr 17 17:25:22.594708 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:22.594669 2565 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lnv76" event={"ID":"66303fe9-2e35-470e-8d83-6c31c7481520","Type":"ContainerStarted","Data":"4ddf28644dc54b9c7008ca137858eca285cb4cacd39220cfc865b42d77f5bd05"} Apr 17 17:25:23.600659 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:23.600614 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qhbg6" event={"ID":"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523","Type":"ContainerStarted","Data":"aa140d0bef77ae6967184da67fe9043a08c1681d52922a356f07187a1268a3c1"} Apr 17 17:25:23.602412 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:23.602379 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" event={"ID":"0e9e4663-293c-4ae6-b4a9-c7dd6e747e96","Type":"ContainerStarted","Data":"08fefcb0680d7555f21da93aa4f56261c6da7ee27617b56476e16109c836921a"} Apr 17 17:25:23.603020 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:23.602984 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" Apr 17 17:25:23.608762 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:23.608736 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" Apr 17 17:25:23.622373 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:23.622297 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-74bnv" podStartSLOduration=2.141839562 podStartE2EDuration="3.622277717s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.304436988 +0000 UTC m=+70.637054611" lastFinishedPulling="2026-04-17 17:25:22.784874923 +0000 UTC m=+72.117492766" observedRunningTime="2026-04-17 17:25:23.619991119 +0000 UTC 
m=+72.952608767" watchObservedRunningTime="2026-04-17 17:25:23.622277717 +0000 UTC m=+72.954895370" Apr 17 17:25:25.611112 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:25.611058 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qhbg6" event={"ID":"e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523","Type":"ContainerStarted","Data":"87c720766026423dcab7198b6aaa6a9dda7301f00e40f0ce8031b2882df20b6b"} Apr 17 17:25:25.631597 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:25.631546 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qhbg6" podStartSLOduration=2.2846932349999998 podStartE2EDuration="5.631532251s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.473720134 +0000 UTC m=+70.806337759" lastFinishedPulling="2026-04-17 17:25:24.820559142 +0000 UTC m=+74.153176775" observedRunningTime="2026-04-17 17:25:25.630584619 +0000 UTC m=+74.963202266" watchObservedRunningTime="2026-04-17 17:25:25.631532251 +0000 UTC m=+74.964149896" Apr 17 17:25:26.485699 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:26.485651 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kzg6t" Apr 17 17:25:30.042775 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.042740 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lcv9k"] Apr 17 17:25:30.076479 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.076451 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.079757 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.079720 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:25:30.079906 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.079762 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:25:30.080434 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.080410 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:25:30.080434 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.080427 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ts64k\"" Apr 17 17:25:30.080595 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.080441 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:25:30.273433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273397 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-tls\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273614 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273443 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-wtmp\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " 
pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273614 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273492 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-root\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273614 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273528 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zztw\" (UniqueName: \"kubernetes.io/projected/f3361750-3e11-42c6-b36a-e43635abdcfe-kube-api-access-5zztw\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273614 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273585 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3361750-3e11-42c6-b36a-e43635abdcfe-metrics-client-ca\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273769 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273645 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-accelerators-collector-config\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273769 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273690 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-sys\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273769 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273707 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-textfile\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.273769 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.273749 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375022 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.374988 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-sys\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375022 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375022 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-textfile\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375043 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-tls\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375102 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-wtmp\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375102 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-sys\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375134 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-root\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375179 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zztw\" (UniqueName: \"kubernetes.io/projected/f3361750-3e11-42c6-b36a-e43635abdcfe-kube-api-access-5zztw\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3361750-3e11-42c6-b36a-e43635abdcfe-metrics-client-ca\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375250 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-accelerators-collector-config\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375180 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-root\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375615 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375301 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-wtmp\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375615 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375397 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-textfile\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.375749 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.375729 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-accelerators-collector-config\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.377463 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.377434 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.377463 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.377442 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f3361750-3e11-42c6-b36a-e43635abdcfe-node-exporter-tls\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.384670 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.384651 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zztw\" (UniqueName: \"kubernetes.io/projected/f3361750-3e11-42c6-b36a-e43635abdcfe-kube-api-access-5zztw\") pod \"node-exporter-lcv9k\" (UID: 
\"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.386281 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.386254 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3361750-3e11-42c6-b36a-e43635abdcfe-metrics-client-ca\") pod \"node-exporter-lcv9k\" (UID: \"f3361750-3e11-42c6-b36a-e43635abdcfe\") " pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.387573 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.387558 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lcv9k" Apr 17 17:25:30.397913 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:30.397887 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3361750_3e11_42c6_b36a_e43635abdcfe.slice/crio-10509a06ce90798c9ad83b137a7f119579d2ad5bd377bef5d46abac165db7806 WatchSource:0}: Error finding container 10509a06ce90798c9ad83b137a7f119579d2ad5bd377bef5d46abac165db7806: Status 404 returned error can't find the container with id 10509a06ce90798c9ad83b137a7f119579d2ad5bd377bef5d46abac165db7806 Apr 17 17:25:30.595024 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.594980 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zlw54" Apr 17 17:25:30.628586 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:30.628488 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcv9k" event={"ID":"f3361750-3e11-42c6-b36a-e43635abdcfe","Type":"ContainerStarted","Data":"10509a06ce90798c9ad83b137a7f119579d2ad5bd377bef5d46abac165db7806"} Apr 17 17:25:31.037868 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.037781 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:25:31.055452 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:25:31.055217 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.058060 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.057510 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:25:31.058060 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.057979 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 17:25:31.058454 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.058433 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 17:25:31.061822 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.061800 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 17:25:31.062146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.062126 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 17:25:31.062328 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.062312 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 17:25:31.064923 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.064903 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cp7h2\"" Apr 17 17:25:31.065215 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.065197 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 17:25:31.065394 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.065379 2565 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 17:25:31.065548 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.065528 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 17:25:31.065746 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.065675 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 17:25:31.182331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182178 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182223 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-config-volume\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182243 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182269 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-web-config\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182325 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-config-out\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182342 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182415 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182432 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4frc\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-kube-api-access-h4frc\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182463 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182482 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182507 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.182709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.182523 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283681 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283647 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-config-out\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283681 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283957 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283745 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283957 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283793 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283957 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:25:31.283820 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4frc\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-kube-api-access-h4frc\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283957 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283883 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283957 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283908 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.283957 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283950 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.284242 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.283978 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.284242 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.284009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.284242 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.284057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-config-volume\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.284242 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.284080 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.284242 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.284110 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-web-config\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.288051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.287741 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.289339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.288706 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.289339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.288734 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.289339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.288824 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.290230 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.289715 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.290230 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.290173 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.290582 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.290564 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-config-out\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.293572 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.293548 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-web-config\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.297131 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.297104 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.297236 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.297158 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.299133 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.299111 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.299664 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.299597 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-config-volume\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.307663 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.307636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4frc\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-kube-api-access-h4frc\") pod \"alertmanager-main-0\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.374585 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.374112 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:25:31.625881 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:31.625826 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:25:31.689630 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:31.689576 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd34029b6_2a7d_4953_95a9_e7402caac075.slice/crio-e1b136d304ebc757b68ca0cceb6a970ee1c49812a28f94f16466e9c29787bfbd WatchSource:0}: Error finding container e1b136d304ebc757b68ca0cceb6a970ee1c49812a28f94f16466e9c29787bfbd: Status 404 returned error can't find the container with id e1b136d304ebc757b68ca0cceb6a970ee1c49812a28f94f16466e9c29787bfbd Apr 17 17:25:32.068486 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.068455 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-cf98947dc-vk8tl"] Apr 17 17:25:32.072429 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.072411 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.075308 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.075188 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-f7r1n520nf6sk\"" Apr 17 17:25:32.075308 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.075226 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-vvv49\"" Apr 17 17:25:32.075308 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.075268 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 17:25:32.075567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.075353 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 17:25:32.075567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.075374 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 17:25:32.075567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.075399 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 17:25:32.075567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.075373 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 17:25:32.083417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.083393 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-cf98947dc-vk8tl"] Apr 17 17:25:32.092013 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.091801 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.092013 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.091875 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.092013 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.091902 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr69j\" (UniqueName: \"kubernetes.io/projected/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-kube-api-access-vr69j\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.092013 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.091965 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.092285 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.092027 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.092285 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.092052 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-metrics-client-ca\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.092285 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.092078 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-tls\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.092285 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.092145 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-grpc-tls\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.192681 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.192579 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-web\") pod 
\"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.192883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.192759 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.192883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.192798 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-metrics-client-ca\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.192883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.192820 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-tls\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.192883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.192873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-grpc-tls\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.193108 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.192907 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.193108 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.192953 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.193237 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.193209 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr69j\" (UniqueName: \"kubernetes.io/projected/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-kube-api-access-vr69j\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.194514 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.194290 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-metrics-client-ca\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.197195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.196647 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-tls\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.197195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.196898 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.197195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.197153 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-grpc-tls\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.197859 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.197803 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.198334 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.198307 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " 
pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.198951 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.198930 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.203125 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.203080 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr69j\" (UniqueName: \"kubernetes.io/projected/3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd-kube-api-access-vr69j\") pod \"thanos-querier-cf98947dc-vk8tl\" (UID: \"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd\") " pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.383167 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.383064 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" Apr 17 17:25:32.636577 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.636492 2565 generic.go:358] "Generic (PLEG): container finished" podID="f3361750-3e11-42c6-b36a-e43635abdcfe" containerID="8a1afec3725c8cffd318d180f4e8298eb6204e1f1737b37c72c81bd7d4171a89" exitCode=0 Apr 17 17:25:32.636766 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.636585 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcv9k" event={"ID":"f3361750-3e11-42c6-b36a-e43635abdcfe","Type":"ContainerDied","Data":"8a1afec3725c8cffd318d180f4e8298eb6204e1f1737b37c72c81bd7d4171a89"} Apr 17 17:25:32.637988 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:32.637958 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerStarted","Data":"e1b136d304ebc757b68ca0cceb6a970ee1c49812a28f94f16466e9c29787bfbd"} Apr 17 17:25:34.277011 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.276962 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm"] Apr 17 17:25:34.281080 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.281050 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.284334 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.284306 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-ghd9p\"" Apr 17 17:25:34.285148 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.285113 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2atv4cru3jkmc\"" Apr 17 17:25:34.285631 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.285599 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 17:25:34.286441 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.286179 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:25:34.286441 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.286242 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 17:25:34.286441 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.286329 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 17:25:34.295976 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.295949 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm"] Apr 17 17:25:34.307671 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.307641 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-secret-metrics-server-client-certs\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") 
" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.307913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.307885 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9019f918-5ea9-4140-aa80-f6a686c46531-metrics-server-audit-profiles\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.308023 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.307933 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9019f918-5ea9-4140-aa80-f6a686c46531-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.308023 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.308001 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9019f918-5ea9-4140-aa80-f6a686c46531-audit-log\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.308156 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.308065 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-client-ca-bundle\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.308156 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.308104 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-secret-metrics-server-tls\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.308156 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.308139 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p47c\" (UniqueName: \"kubernetes.io/projected/9019f918-5ea9-4140-aa80-f6a686c46531-kube-api-access-9p47c\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.408599 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.408560 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-client-ca-bundle\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.408783 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.408611 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-secret-metrics-server-tls\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.408783 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.408642 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9p47c\" (UniqueName: \"kubernetes.io/projected/9019f918-5ea9-4140-aa80-f6a686c46531-kube-api-access-9p47c\") pod \"metrics-server-8b7cf6bd8-jjfpm\" 
(UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.408783 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.408683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-secret-metrics-server-client-certs\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.408783 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.408734 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9019f918-5ea9-4140-aa80-f6a686c46531-metrics-server-audit-profiles\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.408783 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.408768 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9019f918-5ea9-4140-aa80-f6a686c46531-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.409072 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.408807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9019f918-5ea9-4140-aa80-f6a686c46531-audit-log\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.409298 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.409210 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9019f918-5ea9-4140-aa80-f6a686c46531-audit-log\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.409693 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.409669 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9019f918-5ea9-4140-aa80-f6a686c46531-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.410318 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.410284 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9019f918-5ea9-4140-aa80-f6a686c46531-metrics-server-audit-profiles\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.411544 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.411523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-secret-metrics-server-tls\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.411804 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.411758 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-secret-metrics-server-client-certs\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: 
\"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.417831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.417807 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p47c\" (UniqueName: \"kubernetes.io/projected/9019f918-5ea9-4140-aa80-f6a686c46531-kube-api-access-9p47c\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.424181 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.424155 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9019f918-5ea9-4140-aa80-f6a686c46531-client-ca-bundle\") pod \"metrics-server-8b7cf6bd8-jjfpm\" (UID: \"9019f918-5ea9-4140-aa80-f6a686c46531\") " pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:34.596590 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:34.596550 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" Apr 17 17:25:36.469081 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.469048 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b4868b6cb-hh4p7"] Apr 17 17:25:36.474136 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.474103 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b4868b6cb-hh4p7" Apr 17 17:25:36.478751 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.478728 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:25:36.479737 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.479717 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:25:36.479873 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.479777 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:25:36.481197 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.481082 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:25:36.481420 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.481398 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:25:36.481538 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.481475 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-p4d47\"" Apr 17 17:25:36.485177 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.485157 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:25:36.487490 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.487470 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b4868b6cb-hh4p7"] Apr 17 17:25:36.523429 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.523399 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-serving-cert\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.523592 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.523434 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-oauth-serving-cert\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.523592 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.523508 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-service-ca\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.523592 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.523560 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-config\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.523592 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.523589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-oauth-config\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.523817 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.523620 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv85w\" (UniqueName: \"kubernetes.io/projected/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-kube-api-access-vv85w\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.523817 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.523675 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-trusted-ca-bundle\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.624723 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.624689 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-serving-cert\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.624912 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.624738 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-oauth-serving-cert\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.624912 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.624788 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-service-ca\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.624912 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.624818 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-config\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.625205 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.625178 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-oauth-config\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.625331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.625233 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv85w\" (UniqueName: \"kubernetes.io/projected/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-kube-api-access-vv85w\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.625331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.625286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-trusted-ca-bundle\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.625580 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.625554 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-config\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.625652 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.625593 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-oauth-serving-cert\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.625774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.625722 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-service-ca\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.626482 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.626461 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-trusted-ca-bundle\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.627834 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.627814 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-serving-cert\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.628087 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.628067 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-oauth-config\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.635351 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.635326 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv85w\" (UniqueName: \"kubernetes.io/projected/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-kube-api-access-vv85w\") pod \"console-5b4868b6cb-hh4p7\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:36.786801 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:36.786710 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:38.573871 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:38.573827 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:25:39.278187 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.278142 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-cf98947dc-vk8tl"]
Apr 17 17:25:39.279996 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:39.279971 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fbf76fd_2efd_4c3f_9e4d_de9aa98758dd.slice/crio-60c797d70279e6dee8ffa79e785cbc681935647ba8656722ba4d6cf0147d55f1 WatchSource:0}: Error finding container 60c797d70279e6dee8ffa79e785cbc681935647ba8656722ba4d6cf0147d55f1: Status 404 returned error can't find the container with id 60c797d70279e6dee8ffa79e785cbc681935647ba8656722ba4d6cf0147d55f1
Apr 17 17:25:39.283213 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.283173 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b4868b6cb-hh4p7"]
Apr 17 17:25:39.288509 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:39.288471 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ef0b39b_a2e0_4990_aa29_abeff0c4301b.slice/crio-3600fea5d7482c8f6178f018ad9c12e607b0ee4116e8133e033fb8728a07fbd4 WatchSource:0}: Error finding container 3600fea5d7482c8f6178f018ad9c12e607b0ee4116e8133e033fb8728a07fbd4: Status 404 returned error can't find the container with id 3600fea5d7482c8f6178f018ad9c12e607b0ee4116e8133e033fb8728a07fbd4
Apr 17 17:25:39.298908 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.298883 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm"]
Apr 17 17:25:39.301022 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:25:39.300989 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9019f918_5ea9_4140_aa80_f6a686c46531.slice/crio-1c9a99965667cd60254b31eadfdca840f2a4c609052658beb6d5701d450d4ad0 WatchSource:0}: Error finding container 1c9a99965667cd60254b31eadfdca840f2a4c609052658beb6d5701d450d4ad0: Status 404 returned error can't find the container with id 1c9a99965667cd60254b31eadfdca840f2a4c609052658beb6d5701d450d4ad0
Apr 17 17:25:39.660917 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.660878 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-lnv76" event={"ID":"66303fe9-2e35-470e-8d83-6c31c7481520","Type":"ContainerStarted","Data":"1d2c82d0f9b6b498e81a11e165f7a2d5ccaa4934f4ed45efaed549fec688f2d9"}
Apr 17 17:25:39.661350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.661057 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-lnv76"
Apr 17 17:25:39.662195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.662141 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b4868b6cb-hh4p7" event={"ID":"1ef0b39b-a2e0-4990-aa29-abeff0c4301b","Type":"ContainerStarted","Data":"3600fea5d7482c8f6178f018ad9c12e607b0ee4116e8133e033fb8728a07fbd4"}
Apr 17 17:25:39.664406 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.664379 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcv9k" event={"ID":"f3361750-3e11-42c6-b36a-e43635abdcfe","Type":"ContainerStarted","Data":"1dd0180265b31545df8add6a5ba43355af4f03f8d5d2fd8c623933acf1401066"}
Apr 17 17:25:39.664406 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.664411 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lcv9k" event={"ID":"f3361750-3e11-42c6-b36a-e43635abdcfe","Type":"ContainerStarted","Data":"7447f62710cbe8b53a9f72046ca9047e28317082af80d53544e0405023d78f8e"}
Apr 17 17:25:39.665896 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.665870 2565 generic.go:358] "Generic (PLEG): container finished" podID="d34029b6-2a7d-4953-95a9-e7402caac075" containerID="1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295" exitCode=0
Apr 17 17:25:39.666002 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.665952 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295"}
Apr 17 17:25:39.667215 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.667190 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" event={"ID":"9019f918-5ea9-4140-aa80-f6a686c46531","Type":"ContainerStarted","Data":"1c9a99965667cd60254b31eadfdca840f2a4c609052658beb6d5701d450d4ad0"}
Apr 17 17:25:39.668377 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.668355 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" event={"ID":"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd","Type":"ContainerStarted","Data":"60c797d70279e6dee8ffa79e785cbc681935647ba8656722ba4d6cf0147d55f1"}
Apr 17 17:25:39.673635 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.673616 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-lnv76"
Apr 17 17:25:39.682596 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.682545 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-lnv76" podStartSLOduration=1.23364875 podStartE2EDuration="18.682531395s" podCreationTimestamp="2026-04-17 17:25:21 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.697006982 +0000 UTC m=+71.029624605" lastFinishedPulling="2026-04-17 17:25:39.145889617 +0000 UTC m=+88.478507250" observedRunningTime="2026-04-17 17:25:39.681233698 +0000 UTC m=+89.013851355" watchObservedRunningTime="2026-04-17 17:25:39.682531395 +0000 UTC m=+89.015149031"
Apr 17 17:25:39.752713 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:39.752660 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lcv9k" podStartSLOduration=8.450976784 podStartE2EDuration="9.75264029s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:30.399652663 +0000 UTC m=+79.732270306" lastFinishedPulling="2026-04-17 17:25:31.701316188 +0000 UTC m=+81.033933812" observedRunningTime="2026-04-17 17:25:39.75235178 +0000 UTC m=+89.084969426" watchObservedRunningTime="2026-04-17 17:25:39.75264029 +0000 UTC m=+89.085257936"
Apr 17 17:25:43.309088 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:43.309053 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-596b4b48f5-ptwxj"]
Apr 17 17:25:45.691312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.691271 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" event={"ID":"9019f918-5ea9-4140-aa80-f6a686c46531","Type":"ContainerStarted","Data":"d0ebfdfe7ef1a955c8f30f44ca0187f93d16d9229f19fc486cbc269864e00625"}
Apr 17 17:25:45.693658 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.693629 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" event={"ID":"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd","Type":"ContainerStarted","Data":"efa0c8180922efca5e14e8efa82b31a3d9006d035dec2cf083d6bce9571411e0"}
Apr 17 17:25:45.693770 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.693667 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" event={"ID":"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd","Type":"ContainerStarted","Data":"af4cc2f271beb9368655614da09d0818d43146dddafc72c255c3a0ee406ef766"}
Apr 17 17:25:45.693770 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.693681 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" event={"ID":"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd","Type":"ContainerStarted","Data":"91dfdea931604713f3bda47722fc26cbdffce9a93d922686593ad35409c24d93"}
Apr 17 17:25:45.696012 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.695981 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b4868b6cb-hh4p7" event={"ID":"1ef0b39b-a2e0-4990-aa29-abeff0c4301b","Type":"ContainerStarted","Data":"de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943"}
Apr 17 17:25:45.699534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.699327 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerStarted","Data":"7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad"}
Apr 17 17:25:45.699534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.699357 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerStarted","Data":"8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89"}
Apr 17 17:25:45.699534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.699371 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerStarted","Data":"b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9"}
Apr 17 17:25:45.699534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.699395 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerStarted","Data":"b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c"}
Apr 17 17:25:45.711070 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.711018 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm" podStartSLOduration=5.934181109 podStartE2EDuration="11.711005389s" podCreationTimestamp="2026-04-17 17:25:34 +0000 UTC" firstStartedPulling="2026-04-17 17:25:39.302790177 +0000 UTC m=+88.635407802" lastFinishedPulling="2026-04-17 17:25:45.07961445 +0000 UTC m=+94.412232082" observedRunningTime="2026-04-17 17:25:45.708955972 +0000 UTC m=+95.041573619" watchObservedRunningTime="2026-04-17 17:25:45.711005389 +0000 UTC m=+95.043623034"
Apr 17 17:25:45.726427 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:45.726343 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b4868b6cb-hh4p7" podStartSLOduration=3.929565469 podStartE2EDuration="9.726328053s" podCreationTimestamp="2026-04-17 17:25:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:39.291059708 +0000 UTC m=+88.623677332" lastFinishedPulling="2026-04-17 17:25:45.087822277 +0000 UTC m=+94.420439916" observedRunningTime="2026-04-17 17:25:45.725489128 +0000 UTC m=+95.058106773" watchObservedRunningTime="2026-04-17 17:25:45.726328053 +0000 UTC m=+95.058945699"
Apr 17 17:25:46.708898 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:46.708832 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerStarted","Data":"07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2"}
Apr 17 17:25:46.787013 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:46.786976 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:46.787186 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:46.787024 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:46.792348 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:46.792320 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:47.716776 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:47.716731 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" event={"ID":"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd","Type":"ContainerStarted","Data":"141648fd2e8882c22034f359398e2efc262368b027fe17577d0fa6e04232db93"}
Apr 17 17:25:47.716776 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:47.716780 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" event={"ID":"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd","Type":"ContainerStarted","Data":"dd0bdfe8f2829dfe94f52a5289f1b06f969b3598bf3c825186b13eb632c3fa8c"}
Apr 17 17:25:47.717293 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:47.716794 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" event={"ID":"3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd","Type":"ContainerStarted","Data":"ff067f784c231075b45e47fc07b30fdb00ca46976d13e258fe41450959a51069"}
Apr 17 17:25:47.720399 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:47.720368 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerStarted","Data":"08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0"}
Apr 17 17:25:47.724771 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:47.724745 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b4868b6cb-hh4p7"
Apr 17 17:25:47.741097 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:47.741036 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl" podStartSLOduration=7.849104899 podStartE2EDuration="15.741017363s" podCreationTimestamp="2026-04-17 17:25:32 +0000 UTC" firstStartedPulling="2026-04-17 17:25:39.282798346 +0000 UTC m=+88.615415973" lastFinishedPulling="2026-04-17 17:25:47.174710806 +0000 UTC m=+96.507328437" observedRunningTime="2026-04-17 17:25:47.738959591 +0000 UTC m=+97.071577240" watchObservedRunningTime="2026-04-17 17:25:47.741017363 +0000 UTC m=+97.073634986"
Apr 17 17:25:47.772412 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:47.772357 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.452611968 podStartE2EDuration="16.772319456s" podCreationTimestamp="2026-04-17 17:25:31 +0000 UTC" firstStartedPulling="2026-04-17 17:25:31.697695188 +0000 UTC m=+81.030312817" lastFinishedPulling="2026-04-17 17:25:47.017402669 +0000 UTC m=+96.350020305" observedRunningTime="2026-04-17 17:25:47.770727733 +0000 UTC m=+97.103345416" watchObservedRunningTime="2026-04-17 17:25:47.772319456 +0000 UTC m=+97.104937101"
Apr 17 17:25:48.726919 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:48.726880 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl"
Apr 17 17:25:54.597357 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:54.597315 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm"
Apr 17 17:25:54.597357 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:54.597357 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm"
Apr 17 17:25:54.736000 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:54.735968 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-cf98947dc-vk8tl"
Apr 17 17:25:59.765128 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:59.765090 2565 generic.go:358] "Generic (PLEG): container finished" podID="30189955-9e22-4280-bbeb-b99c4bea9d98" containerID="c36cbfb3473e69e3298fd8f96dad1379c86131cdb73baf6b44d9c42551721689" exitCode=0
Apr 17 17:25:59.765518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:59.765153 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" event={"ID":"30189955-9e22-4280-bbeb-b99c4bea9d98","Type":"ContainerDied","Data":"c36cbfb3473e69e3298fd8f96dad1379c86131cdb73baf6b44d9c42551721689"}
Apr 17 17:25:59.765518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:25:59.765487 2565 scope.go:117] "RemoveContainer" containerID="c36cbfb3473e69e3298fd8f96dad1379c86131cdb73baf6b44d9c42551721689"
Apr 17 17:26:00.769714 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:00.769675 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gq4sp" event={"ID":"30189955-9e22-4280-bbeb-b99c4bea9d98","Type":"ContainerStarted","Data":"9a01c780af8588667d3dcf79c088849b9a45a68333de842de8ec1262bdc102c2"}
Apr 17 17:26:08.331190 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.331130 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" podUID="72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" containerName="registry" containerID="cri-o://56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7" gracePeriod=30
Apr 17 17:26:08.568629 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.568606 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:26:08.730170 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730076 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-trusted-ca\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.730170 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730130 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-ca-trust-extracted\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.730170 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730165 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-bound-sa-token\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.730472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730225 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-installation-pull-secrets\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.730472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730251 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-image-registry-private-configuration\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.730472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730278 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-certificates\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.730472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730312 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x79z\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-kube-api-access-2x79z\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.730472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.730344 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") pod \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\" (UID: \"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0\") "
Apr 17 17:26:08.731476 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.731437 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:08.731603 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.731518 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:08.733413 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.733377 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:26:08.733525 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.733482 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:26:08.733525 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.733487 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-kube-api-access-2x79z" (OuterVolumeSpecName: "kube-api-access-2x79z") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "kube-api-access-2x79z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:26:08.733614 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.733536 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:08.733649 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.733619 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:08.738910 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.738880 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" (UID: "72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:26:08.796173 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.796139 2565 generic.go:358] "Generic (PLEG): container finished" podID="72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" containerID="56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7" exitCode=0
Apr 17 17:26:08.796332 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.796199 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj"
Apr 17 17:26:08.796332 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.796211 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" event={"ID":"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0","Type":"ContainerDied","Data":"56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7"}
Apr 17 17:26:08.796332 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.796247 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-596b4b48f5-ptwxj" event={"ID":"72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0","Type":"ContainerDied","Data":"b1687ded54391324a6ac12a3025290f0f48ee4da3ee1536584bacc65b89b7fe7"}
Apr 17 17:26:08.796332 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.796264 2565 scope.go:117] "RemoveContainer" containerID="56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7"
Apr 17 17:26:08.808463 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.808435 2565 scope.go:117] "RemoveContainer" containerID="56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7"
Apr 17 17:26:08.808710 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:08.808690 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7\": container with ID starting with 56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7 not found: ID does not exist" containerID="56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7"
Apr 17 17:26:08.808754 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.808717 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7"} err="failed to get container status \"56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7\": rpc error: code = NotFound desc = could not find container \"56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7\": container with ID starting with 56f04d236a1c648b273ee130b12f08ba05faeb4037e085b1f38c0797c4c30ab7 not found: ID does not exist"
Apr 17 17:26:08.820433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.820413 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-596b4b48f5-ptwxj"]
Apr 17 17:26:08.824437 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.824419 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-596b4b48f5-ptwxj"]
Apr 17 17:26:08.832342 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832324 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-trusted-ca\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:08.832401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832346 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-ca-trust-extracted\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:08.832401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832355 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-bound-sa-token\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:08.832401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832364 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-installation-pull-secrets\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:08.832401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832374 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-image-registry-private-configuration\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:08.832401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832383 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-certificates\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:08.832401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832392 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2x79z\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-kube-api-access-2x79z\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:08.832401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:08.832400 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0-registry-tls\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:26:09.193226 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:09.193190 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" path="/var/lib/kubelet/pods/72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0/volumes"
Apr 17 17:26:09.800795 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:09.800770 2565 generic.go:358] "Generic (PLEG): container finished" podID="b20915a3-9438-4b66-b1a3-83c753d5524b" containerID="40cd339ef8a37f399a50070729177724abcc600c6d34a6e45a6795314c768463" exitCode=0
Apr 17 17:26:09.801280 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:09.800857 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" event={"ID":"b20915a3-9438-4b66-b1a3-83c753d5524b","Type":"ContainerDied","Data":"40cd339ef8a37f399a50070729177724abcc600c6d34a6e45a6795314c768463"}
Apr 17 17:26:09.801280 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:09.801259 2565 scope.go:117] "RemoveContainer" containerID="40cd339ef8a37f399a50070729177724abcc600c6d34a6e45a6795314c768463"
Apr 17 17:26:10.806377 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:10.806335 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qqbrq" event={"ID":"b20915a3-9438-4b66-b1a3-83c753d5524b","Type":"ContainerStarted","Data":"a07851d3c771af4d4572e73032d0eca3fe4c79c755220da3ffea5d4329aa7678"}
Apr 17 17:26:14.602757 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:14.602669 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm"
Apr 17 17:26:14.607033 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:14.607012 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8b7cf6bd8-jjfpm"
Apr 17 17:26:14.819818 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:14.819786 2565 generic.go:358] "Generic (PLEG): container finished" podID="f175c7b4-7e01-4c45-b098-27c87d4ba139" containerID="6fe59ef59a0e3e1aa8baf6ec8bd9f57c2f459c9b62bbb279974f1432c0228b3e" exitCode=0
Apr 17 17:26:14.819994 ip-10-0-143-59
kubenswrapper[2565]: I0417 17:26:14.819870 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mh4jk" event={"ID":"f175c7b4-7e01-4c45-b098-27c87d4ba139","Type":"ContainerDied","Data":"6fe59ef59a0e3e1aa8baf6ec8bd9f57c2f459c9b62bbb279974f1432c0228b3e"} Apr 17 17:26:14.820315 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:14.820299 2565 scope.go:117] "RemoveContainer" containerID="6fe59ef59a0e3e1aa8baf6ec8bd9f57c2f459c9b62bbb279974f1432c0228b3e" Apr 17 17:26:15.823967 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:15.823935 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mh4jk" event={"ID":"f175c7b4-7e01-4c45-b098-27c87d4ba139","Type":"ContainerStarted","Data":"e9ce26872c9e797d13966acb16920df386ad083f52083478b60a3c9d4ed36592"} Apr 17 17:26:50.276059 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.276018 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:26:50.276506 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.276432 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="alertmanager" containerID="cri-o://b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c" gracePeriod=120 Apr 17 17:26:50.276563 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.276500 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-metric" containerID="cri-o://07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2" gracePeriod=120 Apr 17 17:26:50.276563 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.276534 2565 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-web" containerID="cri-o://8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89" gracePeriod=120 Apr 17 17:26:50.276655 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.276588 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="prom-label-proxy" containerID="cri-o://08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0" gracePeriod=120 Apr 17 17:26:50.276655 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.276547 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy" containerID="cri-o://7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad" gracePeriod=120 Apr 17 17:26:50.276772 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.276593 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="config-reloader" containerID="cri-o://b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9" gracePeriod=120 Apr 17 17:26:50.938571 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938539 2565 generic.go:358] "Generic (PLEG): container finished" podID="d34029b6-2a7d-4953-95a9-e7402caac075" containerID="08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0" exitCode=0 Apr 17 17:26:50.938571 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938566 2565 generic.go:358] "Generic (PLEG): container finished" podID="d34029b6-2a7d-4953-95a9-e7402caac075" containerID="07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2" exitCode=0 Apr 17 17:26:50.938571 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:26:50.938572 2565 generic.go:358] "Generic (PLEG): container finished" podID="d34029b6-2a7d-4953-95a9-e7402caac075" containerID="7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad" exitCode=0 Apr 17 17:26:50.938571 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938578 2565 generic.go:358] "Generic (PLEG): container finished" podID="d34029b6-2a7d-4953-95a9-e7402caac075" containerID="b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9" exitCode=0 Apr 17 17:26:50.938831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938584 2565 generic.go:358] "Generic (PLEG): container finished" podID="d34029b6-2a7d-4953-95a9-e7402caac075" containerID="b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c" exitCode=0 Apr 17 17:26:50.938831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938608 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0"} Apr 17 17:26:50.938831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938637 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2"} Apr 17 17:26:50.938831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938647 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad"} Apr 17 17:26:50.938831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938656 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9"} Apr 17 17:26:50.938831 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:50.938665 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c"} Apr 17 17:26:51.535202 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.535176 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:51.716897 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.716770 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-web-config\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.716897 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.716833 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-config-out\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.716897 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.716874 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-cluster-tls-config\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.716897 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.716892 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-tls-assets\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.716924 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.716951 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-trusted-ca-bundle\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.716978 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-config-volume\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717129 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-main-db\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717171 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-metrics-client-ca\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717218 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-main-tls\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717601 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717265 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717601 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717318 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4frc\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-kube-api-access-h4frc\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717601 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717345 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-web\") pod \"d34029b6-2a7d-4953-95a9-e7402caac075\" (UID: \"d34029b6-2a7d-4953-95a9-e7402caac075\") " Apr 17 17:26:51.717601 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717399 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:51.717827 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.717612 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.720046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.719855 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-config-out" (OuterVolumeSpecName: "config-out") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:26:51.720469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.720244 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:51.720469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.720253 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:51.720469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.720256 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:26:51.720469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.720294 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:51.720717 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.720465 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:51.720717 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.720526 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-config-volume" (OuterVolumeSpecName: "config-volume") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:51.720717 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.720592 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:51.722525 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.722497 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:51.722619 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.722596 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-kube-api-access-h4frc" (OuterVolumeSpecName: "kube-api-access-h4frc") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "kube-api-access-h4frc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:51.724602 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.724578 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:51.731206 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.731184 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-web-config" (OuterVolumeSpecName: "web-config") pod "d34029b6-2a7d-4953-95a9-e7402caac075" (UID: "d34029b6-2a7d-4953-95a9-e7402caac075"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:51.818481 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818425 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4frc\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-kube-api-access-h4frc\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818481 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818476 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818481 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818492 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-web-config\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818505 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-config-out\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818519 2565 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-cluster-tls-config\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818531 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d34029b6-2a7d-4953-95a9-e7402caac075-tls-assets\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818542 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818555 2565 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-config-volume\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818569 2565 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d34029b6-2a7d-4953-95a9-e7402caac075-alertmanager-main-db\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818580 2565 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d34029b6-2a7d-4953-95a9-e7402caac075-metrics-client-ca\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818592 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-main-tls\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.818739 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.818604 2565 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d34029b6-2a7d-4953-95a9-e7402caac075-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:26:51.944212 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.944176 2565 generic.go:358] "Generic (PLEG): container finished" podID="d34029b6-2a7d-4953-95a9-e7402caac075" containerID="8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89" exitCode=0 Apr 17 17:26:51.944404 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.944263 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89"} Apr 17 17:26:51.944404 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.944303 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:51.944404 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.944317 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d34029b6-2a7d-4953-95a9-e7402caac075","Type":"ContainerDied","Data":"e1b136d304ebc757b68ca0cceb6a970ee1c49812a28f94f16466e9c29787bfbd"} Apr 17 17:26:51.944404 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.944335 2565 scope.go:117] "RemoveContainer" containerID="08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0" Apr 17 17:26:51.952383 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.952305 2565 scope.go:117] "RemoveContainer" containerID="07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2" Apr 17 17:26:51.961736 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.961710 2565 scope.go:117] "RemoveContainer" containerID="7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad" Apr 17 17:26:51.968926 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.968864 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:26:51.969001 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.968932 2565 scope.go:117] "RemoveContainer" containerID="8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89" Apr 17 17:26:51.972898 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.972871 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:26:51.976276 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.976255 2565 scope.go:117] "RemoveContainer" containerID="b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9" Apr 17 17:26:51.982593 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.982576 2565 scope.go:117] "RemoveContainer" containerID="b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c" Apr 17 17:26:51.989729 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:26:51.989707 2565 scope.go:117] "RemoveContainer" containerID="1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295"
Apr 17 17:26:51.996247 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.996230 2565 scope.go:117] "RemoveContainer" containerID="08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0"
Apr 17 17:26:51.997501 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:51.996646 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0\": container with ID starting with 08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0 not found: ID does not exist" containerID="08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0"
Apr 17 17:26:51.997501 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.996689 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0"} err="failed to get container status \"08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0\": rpc error: code = NotFound desc = could not find container \"08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0\": container with ID starting with 08da91c099e4861155ae1f530369bf6db3e6ec43b0b366a8dbdf1ed13b4521d0 not found: ID does not exist"
Apr 17 17:26:51.997501 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.996716 2565 scope.go:117] "RemoveContainer" containerID="07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2"
Apr 17 17:26:51.997987 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:51.997913 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2\": container with ID starting with 07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2 not found: ID does not exist" containerID="07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2"
Apr 17 17:26:51.998078 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.997987 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2"} err="failed to get container status \"07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2\": rpc error: code = NotFound desc = could not find container \"07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2\": container with ID starting with 07bcd83d2c7c6b4d6f9fe462cdef63e57720e020b10de5ef0966da2de901a3c2 not found: ID does not exist"
Apr 17 17:26:51.998078 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998013 2565 scope.go:117] "RemoveContainer" containerID="7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad"
Apr 17 17:26:51.998366 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:51.998348 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad\": container with ID starting with 7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad not found: ID does not exist" containerID="7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad"
Apr 17 17:26:51.998450 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998370 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad"} err="failed to get container status \"7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad\": rpc error: code = NotFound desc = could not find container \"7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad\": container with ID starting with 7f6da3bdc2e275aa03fd15acdf075875c571acddf27999ee00d560fbfce5d4ad not found: ID does not exist"
Apr 17 17:26:51.998450 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998386 2565 scope.go:117] "RemoveContainer" containerID="8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89"
Apr 17 17:26:51.998633 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:51.998608 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89\": container with ID starting with 8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89 not found: ID does not exist" containerID="8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89"
Apr 17 17:26:51.998688 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998637 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89"} err="failed to get container status \"8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89\": rpc error: code = NotFound desc = could not find container \"8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89\": container with ID starting with 8aa9d1f6afcb3ba13549beee1224f636f1e2e18ca460b344d934afa526398b89 not found: ID does not exist"
Apr 17 17:26:51.998688 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998652 2565 scope.go:117] "RemoveContainer" containerID="b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9"
Apr 17 17:26:51.998923 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:51.998903 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9\": container with ID starting with b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9 not found: ID does not exist" containerID="b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9"
Apr 17 17:26:51.999028 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998928 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9"} err="failed to get container status \"b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9\": rpc error: code = NotFound desc = could not find container \"b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9\": container with ID starting with b46fd223940f98f6220e0f0e67bac95704c408d238d1a5e6325802fe1b8c18e9 not found: ID does not exist"
Apr 17 17:26:51.999028 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998946 2565 scope.go:117] "RemoveContainer" containerID="b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c"
Apr 17 17:26:51.999028 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.998978 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:26:51.999182 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:51.999162 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c\": container with ID starting with b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c not found: ID does not exist" containerID="b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c"
Apr 17 17:26:51.999235 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999181 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c"} err="failed to get container status \"b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c\": rpc error: code = NotFound desc = could not find container \"b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c\": container with ID starting with b2f47759742183ab71bc8bfce26b386f88d23ee96ca00e3d6cab649e498ce06c not found: ID does not exist"
Apr 17 17:26:51.999235 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999194 2565 scope.go:117] "RemoveContainer" containerID="1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295"
Apr 17 17:26:51.999396 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:26:51.999369 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295\": container with ID starting with 1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295 not found: ID does not exist" containerID="1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295"
Apr 17 17:26:51.999396 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999388 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295"} err="failed to get container status \"1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295\": rpc error: code = NotFound desc = could not find container \"1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295\": container with ID starting with 1a82883bc13ebe9a2f6429e317e2f289c1382016cb53ee673ec579c6fcca8295 not found: ID does not exist"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999396 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="prom-label-proxy"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999414 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="prom-label-proxy"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999428 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999436 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999451 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="config-reloader"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999460 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="config-reloader"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999470 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-metric"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999478 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-metric"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999491 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-web"
Apr 17 17:26:51.999509 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999499 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-web"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999515 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="init-config-reloader"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999524 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="init-config-reloader"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999539 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" containerName="registry"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999548 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" containerName="registry"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999562 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="alertmanager"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999570 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="alertmanager"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999649 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-metric"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999662 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="prom-label-proxy"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999676 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy-web"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999687 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="alertmanager"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999698 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="kube-rbac-proxy"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999709 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="72bd8fbf-5bf3-44ea-a9b7-8d6863a2a0f0" containerName="registry"
Apr 17 17:26:51.999992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:51.999720 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" containerName="config-reloader"
Apr 17 17:26:52.004912 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.004894 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.007935 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007917 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 17:26:52.008084 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007917 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 17:26:52.008084 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007917 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 17:26:52.008208 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007922 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 17:26:52.008208 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007923 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 17:26:52.008208 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007931 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cp7h2\""
Apr 17 17:26:52.008208 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007947 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 17:26:52.008208 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007964 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 17:26:52.008423 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.007989 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 17:26:52.013150 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.013131 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 17:26:52.021282 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.021263 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:26:52.120712 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120671 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.120712 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120717 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121003 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120735 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b2566b1-eed4-4934-ad70-704adac645ac-config-out\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121003 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120817 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121003 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120880 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b2566b1-eed4-4934-ad70-704adac645ac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121003 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120932 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-web-config\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121003 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120962 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf42p\" (UniqueName: \"kubernetes.io/projected/5b2566b1-eed4-4934-ad70-704adac645ac-kube-api-access-hf42p\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121003 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.120990 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2566b1-eed4-4934-ad70-704adac645ac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.121018 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b2566b1-eed4-4934-ad70-704adac645ac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.121034 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5b2566b1-eed4-4934-ad70-704adac645ac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.121054 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.121112 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.121238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.121140 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-config-volume\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.221829 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.221730 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.221829 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.221787 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.221829 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.221807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-config-volume\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.221829 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.221859 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.222180 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.222015 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.222180 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.222068 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b2566b1-eed4-4934-ad70-704adac645ac-config-out\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.222180 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.222101 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.222180 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.222125 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b2566b1-eed4-4934-ad70-704adac645ac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.222937 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.222910 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b2566b1-eed4-4934-ad70-704adac645ac-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.223071 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.222974 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-web-config\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.223071 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.223009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf42p\" (UniqueName: \"kubernetes.io/projected/5b2566b1-eed4-4934-ad70-704adac645ac-kube-api-access-hf42p\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.223071 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.223055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2566b1-eed4-4934-ad70-704adac645ac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.223237 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.223099 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b2566b1-eed4-4934-ad70-704adac645ac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.223237 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.223125 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5b2566b1-eed4-4934-ad70-704adac645ac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.223435 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.223413 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5b2566b1-eed4-4934-ad70-704adac645ac-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.224958 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225074 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225114 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225135 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225141 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-config-volume\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225172 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5b2566b1-eed4-4934-ad70-704adac645ac-config-out\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225757 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225736 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5b2566b1-eed4-4934-ad70-704adac645ac-web-config\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.225917 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.225897 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2566b1-eed4-4934-ad70-704adac645ac-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.226892 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.226874 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5b2566b1-eed4-4934-ad70-704adac645ac-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.233285 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.233268 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf42p\" (UniqueName: \"kubernetes.io/projected/5b2566b1-eed4-4934-ad70-704adac645ac-kube-api-access-hf42p\") pod \"alertmanager-main-0\" (UID: \"5b2566b1-eed4-4934-ad70-704adac645ac\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.314568 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.314527 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:52.443769 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.443744 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:26:52.445734 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:26:52.445700 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2566b1_eed4_4934_ad70_704adac645ac.slice/crio-c6a1e7a9cce5d52395bdd43d0217ba01e50947b96cb8c59eb40ddb9f812f8779 WatchSource:0}: Error finding container c6a1e7a9cce5d52395bdd43d0217ba01e50947b96cb8c59eb40ddb9f812f8779: Status 404 returned error can't find the container with id c6a1e7a9cce5d52395bdd43d0217ba01e50947b96cb8c59eb40ddb9f812f8779
Apr 17 17:26:52.948915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.948879 2565 generic.go:358] "Generic (PLEG): container finished" podID="5b2566b1-eed4-4934-ad70-704adac645ac" containerID="a58190c134ef8d322d85f92fc416de2bf129fa8d6b26a0df81e1f77037e42799" exitCode=0
Apr 17 17:26:52.949368 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.948967 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerDied","Data":"a58190c134ef8d322d85f92fc416de2bf129fa8d6b26a0df81e1f77037e42799"}
Apr 17 17:26:52.949368 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:52.949006 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerStarted","Data":"c6a1e7a9cce5d52395bdd43d0217ba01e50947b96cb8c59eb40ddb9f812f8779"}
Apr 17 17:26:53.194740 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.194365 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34029b6-2a7d-4953-95a9-e7402caac075" path="/var/lib/kubelet/pods/d34029b6-2a7d-4953-95a9-e7402caac075/volumes"
Apr 17 17:26:53.955890 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.955854 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerStarted","Data":"e9f95a493d6abae3aa91b7503371bd5b3dc2302065b55ef8a7d3345a94af45c7"}
Apr 17 17:26:53.955890 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.955892 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerStarted","Data":"48e0949f024675ec95e45738d6c31979c49f66cc6df1839cedcb6637b95d3fcc"}
Apr 17 17:26:53.956328 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.955904 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerStarted","Data":"6a0e72c7b18e610d9d9dc3fde6c1e55b4e79db675ca97cd1f8d0f52a0edfd9fa"}
Apr 17 17:26:53.956328 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.955916 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerStarted","Data":"b5d32243dda288ca34616e06e9c9b846f8d689c251231402b82dd82a8180c4be"}
Apr 17 17:26:53.956328 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.955926 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerStarted","Data":"0155e89944f449135218397cae42a7fd0770efb2633d0100c2f69a431915a4fe"}
Apr 17 17:26:53.956328 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.955938 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5b2566b1-eed4-4934-ad70-704adac645ac","Type":"ContainerStarted","Data":"15baa55ba7d43fc19b835fed0d62a573c6254fc44e252ecb77cc650bd356a5cb"}
Apr 17 17:26:53.996186 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:53.996124 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.996107698 podStartE2EDuration="2.996107698s" podCreationTimestamp="2026-04-17 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:53.994973516 +0000 UTC m=+163.327591184" watchObservedRunningTime="2026-04-17 17:26:53.996107698 +0000 UTC m=+163.328725358"
Apr 17 17:26:54.296084 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.295981 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-648b7db8cc-wk4m9"]
Apr 17 17:26:54.299627 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.299596 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9"
Apr 17 17:26:54.302539 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.302513 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 17:26:54.302539 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.302528 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 17:26:54.302759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.302577 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 17:26:54.302759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.302635 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 17:26:54.302759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.302682 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 17:26:54.302917 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.302789 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-96txl\""
Apr 17 17:26:54.308973 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.308951 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 17:26:54.316482 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.316457 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-648b7db8cc-wk4m9"]
Apr 17 17:26:54.442708 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442669 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"kube-api-access-6lb5f\" (UniqueName: \"kubernetes.io/projected/5ce76d2b-04a7-463e-bd7c-8f6237da5475-kube-api-access-6lb5f\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.442919 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442725 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-federate-client-tls\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.442919 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442769 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-metrics-client-ca\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.442919 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442805 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-telemeter-client-tls\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.442919 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442873 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-secret-telemeter-client\") pod 
\"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.442919 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442892 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-telemeter-trusted-ca-bundle\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.443086 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442927 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.443086 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.442963 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-serving-certs-ca-bundle\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544140 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544109 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lb5f\" (UniqueName: \"kubernetes.io/projected/5ce76d2b-04a7-463e-bd7c-8f6237da5475-kube-api-access-6lb5f\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " 
pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544304 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544154 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-federate-client-tls\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544304 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544271 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-metrics-client-ca\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544393 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544323 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-telemeter-client-tls\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544393 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544365 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-secret-telemeter-client\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544393 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544382 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-telemeter-trusted-ca-bundle\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544514 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.544514 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.544457 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-serving-certs-ca-bundle\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.545064 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.545038 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-metrics-client-ca\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.545268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.545159 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-serving-certs-ca-bundle\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: 
\"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.545268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.545254 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ce76d2b-04a7-463e-bd7c-8f6237da5475-telemeter-trusted-ca-bundle\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.546749 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.546693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-federate-client-tls\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.546868 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.546772 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-telemeter-client-tls\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.546964 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.546946 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-secret-telemeter-client\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.547048 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.547028 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ce76d2b-04a7-463e-bd7c-8f6237da5475-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.552788 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.552771 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lb5f\" (UniqueName: \"kubernetes.io/projected/5ce76d2b-04a7-463e-bd7c-8f6237da5475-kube-api-access-6lb5f\") pod \"telemeter-client-648b7db8cc-wk4m9\" (UID: \"5ce76d2b-04a7-463e-bd7c-8f6237da5475\") " pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.609919 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.609892 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" Apr 17 17:26:54.941748 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.941722 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-648b7db8cc-wk4m9"] Apr 17 17:26:54.943592 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:26:54.943567 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce76d2b_04a7_463e_bd7c_8f6237da5475.slice/crio-0d4ce11427fac141b46fbbbc161a44dd0dd2653aaaf5634d6e45a80365d0d6e1 WatchSource:0}: Error finding container 0d4ce11427fac141b46fbbbc161a44dd0dd2653aaaf5634d6e45a80365d0d6e1: Status 404 returned error can't find the container with id 0d4ce11427fac141b46fbbbc161a44dd0dd2653aaaf5634d6e45a80365d0d6e1 Apr 17 17:26:54.961676 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:54.961642 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" 
event={"ID":"5ce76d2b-04a7-463e-bd7c-8f6237da5475","Type":"ContainerStarted","Data":"0d4ce11427fac141b46fbbbc161a44dd0dd2653aaaf5634d6e45a80365d0d6e1"} Apr 17 17:26:56.970911 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:56.970797 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" event={"ID":"5ce76d2b-04a7-463e-bd7c-8f6237da5475","Type":"ContainerStarted","Data":"7860455c656d9f9004db11fcc1afb4911d97e71650d20f95f1b1b308cbd2a896"} Apr 17 17:26:56.970911 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:56.970870 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" event={"ID":"5ce76d2b-04a7-463e-bd7c-8f6237da5475","Type":"ContainerStarted","Data":"cdafac719f447026911f78699552e3eb945a3457c3cc040a3470ba774d50d0eb"} Apr 17 17:26:56.970911 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:56.970882 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" event={"ID":"5ce76d2b-04a7-463e-bd7c-8f6237da5475","Type":"ContainerStarted","Data":"6816a72743cae352f99db0d51324013e0c02ea6117be46bb5d7aa1664c2d6777"} Apr 17 17:26:57.009580 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:26:57.009525 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-648b7db8cc-wk4m9" podStartSLOduration=1.30999181 podStartE2EDuration="3.009510153s" podCreationTimestamp="2026-04-17 17:26:54 +0000 UTC" firstStartedPulling="2026-04-17 17:26:54.945534634 +0000 UTC m=+164.278152263" lastFinishedPulling="2026-04-17 17:26:56.645052979 +0000 UTC m=+165.977670606" observedRunningTime="2026-04-17 17:26:57.007100548 +0000 UTC m=+166.339718194" watchObservedRunningTime="2026-04-17 17:26:57.009510153 +0000 UTC m=+166.342127795" Apr 17 17:27:08.609950 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:08.609915 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-5b4868b6cb-hh4p7"] Apr 17 17:27:33.632251 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.632190 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b4868b6cb-hh4p7" podUID="1ef0b39b-a2e0-4990-aa29-abeff0c4301b" containerName="console" containerID="cri-o://de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943" gracePeriod=15 Apr 17 17:27:33.866520 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.866499 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b4868b6cb-hh4p7_1ef0b39b-a2e0-4990-aa29-abeff0c4301b/console/0.log" Apr 17 17:27:33.866641 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.866557 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b4868b6cb-hh4p7" Apr 17 17:27:33.894822 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.894750 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv85w\" (UniqueName: \"kubernetes.io/projected/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-kube-api-access-vv85w\") pod \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " Apr 17 17:27:33.894822 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.894813 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-trusted-ca-bundle\") pod \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " Apr 17 17:27:33.895026 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.894920 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-serving-cert\") pod \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\" (UID: 
\"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " Apr 17 17:27:33.895026 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.894949 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-oauth-config\") pod \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " Apr 17 17:27:33.895026 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.894980 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-service-ca\") pod \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " Apr 17 17:27:33.895026 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.895007 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-config\") pod \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " Apr 17 17:27:33.895220 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.895060 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-oauth-serving-cert\") pod \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\" (UID: \"1ef0b39b-a2e0-4990-aa29-abeff0c4301b\") " Apr 17 17:27:33.895445 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.895335 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-service-ca" (OuterVolumeSpecName: "service-ca") pod "1ef0b39b-a2e0-4990-aa29-abeff0c4301b" (UID: "1ef0b39b-a2e0-4990-aa29-abeff0c4301b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:33.895526 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.895426 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-config" (OuterVolumeSpecName: "console-config") pod "1ef0b39b-a2e0-4990-aa29-abeff0c4301b" (UID: "1ef0b39b-a2e0-4990-aa29-abeff0c4301b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:33.895526 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.895493 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1ef0b39b-a2e0-4990-aa29-abeff0c4301b" (UID: "1ef0b39b-a2e0-4990-aa29-abeff0c4301b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:33.895640 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.895620 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1ef0b39b-a2e0-4990-aa29-abeff0c4301b" (UID: "1ef0b39b-a2e0-4990-aa29-abeff0c4301b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:33.897221 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.897193 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1ef0b39b-a2e0-4990-aa29-abeff0c4301b" (UID: "1ef0b39b-a2e0-4990-aa29-abeff0c4301b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:27:33.897476 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.897455 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-kube-api-access-vv85w" (OuterVolumeSpecName: "kube-api-access-vv85w") pod "1ef0b39b-a2e0-4990-aa29-abeff0c4301b" (UID: "1ef0b39b-a2e0-4990-aa29-abeff0c4301b"). InnerVolumeSpecName "kube-api-access-vv85w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:27:33.897527 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.897455 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1ef0b39b-a2e0-4990-aa29-abeff0c4301b" (UID: "1ef0b39b-a2e0-4990-aa29-abeff0c4301b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:27:33.996662 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.996624 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-trusted-ca-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:27:33.996662 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.996658 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-serving-cert\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:27:33.996662 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.996672 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-oauth-config\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:27:33.996929 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:27:33.996699 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-service-ca\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:27:33.996929 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.996711 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-console-config\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:27:33.996929 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.996722 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-oauth-serving-cert\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:27:33.996929 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:33.996734 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vv85w\" (UniqueName: \"kubernetes.io/projected/1ef0b39b-a2e0-4990-aa29-abeff0c4301b-kube-api-access-vv85w\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:27:34.088625 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.088599 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b4868b6cb-hh4p7_1ef0b39b-a2e0-4990-aa29-abeff0c4301b/console/0.log" Apr 17 17:27:34.088790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.088641 2565 generic.go:358] "Generic (PLEG): container finished" podID="1ef0b39b-a2e0-4990-aa29-abeff0c4301b" containerID="de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943" exitCode=2 Apr 17 17:27:34.088790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.088680 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b4868b6cb-hh4p7" 
event={"ID":"1ef0b39b-a2e0-4990-aa29-abeff0c4301b","Type":"ContainerDied","Data":"de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943"} Apr 17 17:27:34.088790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.088712 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b4868b6cb-hh4p7" Apr 17 17:27:34.088790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.088727 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b4868b6cb-hh4p7" event={"ID":"1ef0b39b-a2e0-4990-aa29-abeff0c4301b","Type":"ContainerDied","Data":"3600fea5d7482c8f6178f018ad9c12e607b0ee4116e8133e033fb8728a07fbd4"} Apr 17 17:27:34.088790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.088748 2565 scope.go:117] "RemoveContainer" containerID="de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943" Apr 17 17:27:34.097657 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.097639 2565 scope.go:117] "RemoveContainer" containerID="de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943" Apr 17 17:27:34.097990 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:27:34.097969 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943\": container with ID starting with de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943 not found: ID does not exist" containerID="de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943" Apr 17 17:27:34.098077 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.098010 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943"} err="failed to get container status \"de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943\": rpc error: code = NotFound desc = could not find container 
\"de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943\": container with ID starting with de91357861cad08708be028471500a8ade3e7b49a966138c6276c34d9b358943 not found: ID does not exist"
Apr 17 17:27:34.110238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.110217 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b4868b6cb-hh4p7"]
Apr 17 17:27:34.115999 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:34.115980 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b4868b6cb-hh4p7"]
Apr 17 17:27:35.193377 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:27:35.193345 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef0b39b-a2e0-4990-aa29-abeff0c4301b" path="/var/lib/kubelet/pods/1ef0b39b-a2e0-4990-aa29-abeff0c4301b/volumes"
Apr 17 17:28:21.368606 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.368568 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f6c86f79d-qxg9m"]
Apr 17 17:28:21.369089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.368933 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ef0b39b-a2e0-4990-aa29-abeff0c4301b" containerName="console"
Apr 17 17:28:21.369089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.368946 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef0b39b-a2e0-4990-aa29-abeff0c4301b" containerName="console"
Apr 17 17:28:21.369089 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.369021 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ef0b39b-a2e0-4990-aa29-abeff0c4301b" containerName="console"
Apr 17 17:28:21.373168 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.373145 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.375656 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.375628 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 17:28:21.375789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.375664 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-p4d47\""
Apr 17 17:28:21.375789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.375712 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 17:28:21.376549 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.376533 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 17:28:21.376778 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.376761 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 17:28:21.377004 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.376984 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 17:28:21.382094 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.381446 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 17:28:21.382601 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.382580 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6c86f79d-qxg9m"]
Apr 17 17:28:21.501174 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.501136 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-service-ca\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.501325 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.501199 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-serving-cert\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.501325 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.501217 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-trusted-ca-bundle\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.501325 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.501241 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-oauth-config\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.501325 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.501268 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-console-config\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.501458 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.501336 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvl7t\" (UniqueName: \"kubernetes.io/projected/b32c4eb4-4269-4758-9645-d2483932b25e-kube-api-access-xvl7t\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.501458 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.501374 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-oauth-serving-cert\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602047 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.601999 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-oauth-config\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602192 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602071 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-console-config\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602192 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602103 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvl7t\" (UniqueName: \"kubernetes.io/projected/b32c4eb4-4269-4758-9645-d2483932b25e-kube-api-access-xvl7t\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602192 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602130 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-oauth-serving-cert\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602192 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602169 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-service-ca\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602392 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602239 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-serving-cert\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602392 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602263 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-trusted-ca-bundle\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.602891 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602863 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-console-config\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.603010 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602956 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-oauth-serving-cert\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.603010 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.602994 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-service-ca\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.603126 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.603108 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-trusted-ca-bundle\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.604624 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.604598 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-oauth-config\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.604706 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.604656 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-serving-cert\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.610104 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.610087 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvl7t\" (UniqueName: \"kubernetes.io/projected/b32c4eb4-4269-4758-9645-d2483932b25e-kube-api-access-xvl7t\") pod \"console-5f6c86f79d-qxg9m\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") " pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.684158 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.684077 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:21.810832 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:21.810810 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6c86f79d-qxg9m"]
Apr 17 17:28:21.813883 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:28:21.813766 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb32c4eb4_4269_4758_9645_d2483932b25e.slice/crio-952827768f7dbc42b6a5190b12f04bd79fb76a7161c5ef663e11194ff4113fc1 WatchSource:0}: Error finding container 952827768f7dbc42b6a5190b12f04bd79fb76a7161c5ef663e11194ff4113fc1: Status 404 returned error can't find the container with id 952827768f7dbc42b6a5190b12f04bd79fb76a7161c5ef663e11194ff4113fc1
Apr 17 17:28:22.231525 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:22.231487 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6c86f79d-qxg9m" event={"ID":"b32c4eb4-4269-4758-9645-d2483932b25e","Type":"ContainerStarted","Data":"d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6"}
Apr 17 17:28:22.231525 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:22.231523 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6c86f79d-qxg9m" event={"ID":"b32c4eb4-4269-4758-9645-d2483932b25e","Type":"ContainerStarted","Data":"952827768f7dbc42b6a5190b12f04bd79fb76a7161c5ef663e11194ff4113fc1"}
Apr 17 17:28:22.249985 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:22.249936 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f6c86f79d-qxg9m" podStartSLOduration=1.249922374 podStartE2EDuration="1.249922374s" podCreationTimestamp="2026-04-17 17:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:22.249434079 +0000 UTC m=+251.582051724" watchObservedRunningTime="2026-04-17 17:28:22.249922374 +0000 UTC m=+251.582540019"
Apr 17 17:28:31.684378 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:31.684291 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:31.684378 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:31.684329 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:31.688979 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:31.688954 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:32.271408 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:32.271379 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:28:52.659501 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.659466 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"]
Apr 17 17:28:52.663421 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.663402 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.665887 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.665863 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 17:28:52.665994 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.665926 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l6dq5\""
Apr 17 17:28:52.666787 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.666773 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 17:28:52.672790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.672765 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"]
Apr 17 17:28:52.771171 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.771133 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlbjk\" (UniqueName: \"kubernetes.io/projected/b6256dbe-e786-44c2-b787-08948cc88139-kube-api-access-qlbjk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.771345 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.771271 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.771345 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.771307 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.871888 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.871823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.872070 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.871910 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.872070 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.871949 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlbjk\" (UniqueName: \"kubernetes.io/projected/b6256dbe-e786-44c2-b787-08948cc88139-kube-api-access-qlbjk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.872222 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.872199 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.872279 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.872222 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.880049 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.880024 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlbjk\" (UniqueName: \"kubernetes.io/projected/b6256dbe-e786-44c2-b787-08948cc88139-kube-api-access-qlbjk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:52.973355 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:52.973258 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:28:53.093278 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:53.093255 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"]
Apr 17 17:28:53.095862 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:28:53.095812 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6256dbe_e786_44c2_b787_08948cc88139.slice/crio-80f548e425b710c6ab2f18ef0f6693ff7d3dd75604aedff79d3eeb74af154f03 WatchSource:0}: Error finding container 80f548e425b710c6ab2f18ef0f6693ff7d3dd75604aedff79d3eeb74af154f03: Status 404 returned error can't find the container with id 80f548e425b710c6ab2f18ef0f6693ff7d3dd75604aedff79d3eeb74af154f03
Apr 17 17:28:53.328759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:53.328724 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt" event={"ID":"b6256dbe-e786-44c2-b787-08948cc88139","Type":"ContainerStarted","Data":"80f548e425b710c6ab2f18ef0f6693ff7d3dd75604aedff79d3eeb74af154f03"}
Apr 17 17:28:59.351530 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:59.351494 2565 generic.go:358] "Generic (PLEG): container finished" podID="b6256dbe-e786-44c2-b787-08948cc88139" containerID="805d11943f78072c4b52a9c36256a253d00b5f99bf4202e1e04fc5ed12184f2a" exitCode=0
Apr 17 17:28:59.351939 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:28:59.351588 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt" event={"ID":"b6256dbe-e786-44c2-b787-08948cc88139","Type":"ContainerDied","Data":"805d11943f78072c4b52a9c36256a253d00b5f99bf4202e1e04fc5ed12184f2a"}
Apr 17 17:29:01.360887 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:01.360791 2565 generic.go:358] "Generic (PLEG): container finished" podID="b6256dbe-e786-44c2-b787-08948cc88139" containerID="96505f73a543671cfc8b03c91514169a1bedbe6bc5e76e70f01c93e3f7966f35" exitCode=0
Apr 17 17:29:01.361220 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:01.360879 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt" event={"ID":"b6256dbe-e786-44c2-b787-08948cc88139","Type":"ContainerDied","Data":"96505f73a543671cfc8b03c91514169a1bedbe6bc5e76e70f01c93e3f7966f35"}
Apr 17 17:29:08.386300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:08.386267 2565 generic.go:358] "Generic (PLEG): container finished" podID="b6256dbe-e786-44c2-b787-08948cc88139" containerID="0f312bbdeb38ec8e7955641b54c11201201e92532cb25bf88135cce0345ca849" exitCode=0
Apr 17 17:29:08.386684 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:08.386338 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt" event={"ID":"b6256dbe-e786-44c2-b787-08948cc88139","Type":"ContainerDied","Data":"0f312bbdeb38ec8e7955641b54c11201201e92532cb25bf88135cce0345ca849"}
Apr 17 17:29:09.512761 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.512738 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt"
Apr 17 17:29:09.622291 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.622247 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlbjk\" (UniqueName: \"kubernetes.io/projected/b6256dbe-e786-44c2-b787-08948cc88139-kube-api-access-qlbjk\") pod \"b6256dbe-e786-44c2-b787-08948cc88139\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") "
Apr 17 17:29:09.622505 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.622324 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-util\") pod \"b6256dbe-e786-44c2-b787-08948cc88139\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") "
Apr 17 17:29:09.622505 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.622365 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-bundle\") pod \"b6256dbe-e786-44c2-b787-08948cc88139\" (UID: \"b6256dbe-e786-44c2-b787-08948cc88139\") "
Apr 17 17:29:09.622947 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.622920 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-bundle" (OuterVolumeSpecName: "bundle") pod "b6256dbe-e786-44c2-b787-08948cc88139" (UID: "b6256dbe-e786-44c2-b787-08948cc88139"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:29:09.624574 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.624551 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6256dbe-e786-44c2-b787-08948cc88139-kube-api-access-qlbjk" (OuterVolumeSpecName: "kube-api-access-qlbjk") pod "b6256dbe-e786-44c2-b787-08948cc88139" (UID: "b6256dbe-e786-44c2-b787-08948cc88139"). InnerVolumeSpecName "kube-api-access-qlbjk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:29:09.626199 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.626180 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-util" (OuterVolumeSpecName: "util") pod "b6256dbe-e786-44c2-b787-08948cc88139" (UID: "b6256dbe-e786-44c2-b787-08948cc88139"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:29:09.723497 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.723401 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:29:09.723497 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.723444 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6256dbe-e786-44c2-b787-08948cc88139-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:29:09.723497 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:09.723455 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qlbjk\" (UniqueName: \"kubernetes.io/projected/b6256dbe-e786-44c2-b787-08948cc88139-kube-api-access-qlbjk\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:29:10.393815 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:10.393776 2565 kubelet.go:2569] "SyncLoop (PLEG):
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt" event={"ID":"b6256dbe-e786-44c2-b787-08948cc88139","Type":"ContainerDied","Data":"80f548e425b710c6ab2f18ef0f6693ff7d3dd75604aedff79d3eeb74af154f03"} Apr 17 17:29:10.394006 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:10.393823 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80f548e425b710c6ab2f18ef0f6693ff7d3dd75604aedff79d3eeb74af154f03" Apr 17 17:29:10.394006 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:10.393794 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cjlkbt" Apr 17 17:29:11.110223 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:11.110197 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log" Apr 17 17:29:11.110679 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:11.110232 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log" Apr 17 17:29:11.119748 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:11.119723 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:29:14.621550 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621513 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4"] Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621869 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6256dbe-e786-44c2-b787-08948cc88139" containerName="util" Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621882 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b6256dbe-e786-44c2-b787-08948cc88139" containerName="util" Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621901 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6256dbe-e786-44c2-b787-08948cc88139" containerName="extract" Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621906 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6256dbe-e786-44c2-b787-08948cc88139" containerName="extract" Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621926 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b6256dbe-e786-44c2-b787-08948cc88139" containerName="pull" Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621932 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6256dbe-e786-44c2-b787-08948cc88139" containerName="pull" Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.621990 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b6256dbe-e786-44c2-b787-08948cc88139" containerName="extract" Apr 17 17:29:14.663791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.663748 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4"] Apr 17 17:29:14.664164 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.663890 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:14.666774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.666752 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 17:29:14.666942 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.666808 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-ppwwk\"" Apr 17 17:29:14.666942 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.666819 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 17:29:14.666942 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.666754 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 17:29:14.771333 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.771294 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq4r2\" (UniqueName: \"kubernetes.io/projected/59ba5528-9411-4179-af2b-fb0a44d3ad77-kube-api-access-cq4r2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4\" (UID: \"59ba5528-9411-4179-af2b-fb0a44d3ad77\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:14.771521 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.771346 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/59ba5528-9411-4179-af2b-fb0a44d3ad77-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4\" (UID: \"59ba5528-9411-4179-af2b-fb0a44d3ad77\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:14.872329 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.872250 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq4r2\" (UniqueName: \"kubernetes.io/projected/59ba5528-9411-4179-af2b-fb0a44d3ad77-kube-api-access-cq4r2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4\" (UID: \"59ba5528-9411-4179-af2b-fb0a44d3ad77\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:14.872329 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.872306 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/59ba5528-9411-4179-af2b-fb0a44d3ad77-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4\" (UID: \"59ba5528-9411-4179-af2b-fb0a44d3ad77\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:14.874589 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.874560 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/59ba5528-9411-4179-af2b-fb0a44d3ad77-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4\" (UID: \"59ba5528-9411-4179-af2b-fb0a44d3ad77\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:14.880620 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.880581 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq4r2\" (UniqueName: \"kubernetes.io/projected/59ba5528-9411-4179-af2b-fb0a44d3ad77-kube-api-access-cq4r2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4\" (UID: \"59ba5528-9411-4179-af2b-fb0a44d3ad77\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:14.976657 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:14.976618 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" Apr 17 17:29:15.102702 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:15.102677 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4"] Apr 17 17:29:15.105511 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:29:15.105479 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ba5528_9411_4179_af2b_fb0a44d3ad77.slice/crio-abfe1200ee5404b19c6b95827aa8ddb483d6648294cfe303c5c38e8d716ee534 WatchSource:0}: Error finding container abfe1200ee5404b19c6b95827aa8ddb483d6648294cfe303c5c38e8d716ee534: Status 404 returned error can't find the container with id abfe1200ee5404b19c6b95827aa8ddb483d6648294cfe303c5c38e8d716ee534 Apr 17 17:29:15.107350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:15.107327 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:29:15.410799 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:15.410766 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" event={"ID":"59ba5528-9411-4179-af2b-fb0a44d3ad77","Type":"ContainerStarted","Data":"abfe1200ee5404b19c6b95827aa8ddb483d6648294cfe303c5c38e8d716ee534"} Apr 17 17:29:22.386406 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.386371 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"] Apr 17 17:29:22.389815 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.389790 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.392375 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.392354 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 17 17:29:22.393590 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.393568 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-sb8rf\""
Apr 17 17:29:22.393665 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.393568 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 17 17:29:22.400518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.400485 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"]
Apr 17 17:29:22.432820 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.432791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/45e60926-c374-4147-9759-fdb51b823af9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.433007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.432868 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.433007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.432903 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz94c\" (UniqueName: \"kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-kube-api-access-zz94c\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.435682 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.435647 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" event={"ID":"59ba5528-9411-4179-af2b-fb0a44d3ad77","Type":"ContainerStarted","Data":"6d7d9fcac8aa21c5cdb68493f9b644c7a6a94eb28e8a129b917d36eb58eb11e3"}
Apr 17 17:29:22.435872 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.435720 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4"
Apr 17 17:29:22.457051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.456992 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4" podStartSLOduration=2.019826813 podStartE2EDuration="8.456976683s" podCreationTimestamp="2026-04-17 17:29:14 +0000 UTC" firstStartedPulling="2026-04-17 17:29:15.107523277 +0000 UTC m=+304.440140904" lastFinishedPulling="2026-04-17 17:29:21.544673141 +0000 UTC m=+310.877290774" observedRunningTime="2026-04-17 17:29:22.45425596 +0000 UTC m=+311.786873607" watchObservedRunningTime="2026-04-17 17:29:22.456976683 +0000 UTC m=+311.789594330"
Apr 17 17:29:22.534318 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.534276 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.534516 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.534346 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz94c\" (UniqueName: \"kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-kube-api-access-zz94c\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.534516 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:29:22.534372 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 17 17:29:22.534516 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:29:22.534398 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 17 17:29:22.534516 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.534405 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/45e60926-c374-4147-9759-fdb51b823af9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.534516 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:29:22.534418 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v: references non-existent secret key: tls.crt
Apr 17 17:29:22.534516 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:29:22.534479 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-certificates podName:45e60926-c374-4147-9759-fdb51b823af9 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:23.034460136 +0000 UTC m=+312.367077774 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-certificates") pod "keda-metrics-apiserver-7c9f485588-qb47v" (UID: "45e60926-c374-4147-9759-fdb51b823af9") : references non-existent secret key: tls.crt
Apr 17 17:29:22.534810 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.534793 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/45e60926-c374-4147-9759-fdb51b823af9-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:22.543166 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:22.543133 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz94c\" (UniqueName: \"kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-kube-api-access-zz94c\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:23.040314 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:23.040272 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:23.042821 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:23.042792 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/45e60926-c374-4147-9759-fdb51b823af9-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qb47v\" (UID: \"45e60926-c374-4147-9759-fdb51b823af9\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:23.301712 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:23.301688 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:23.443124 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:23.443101 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"]
Apr 17 17:29:23.444760 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:29:23.444732 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e60926_c374_4147_9759_fdb51b823af9.slice/crio-9ebfce5121dc9b6a32fab89c0c6ca3856893424021edbe8cfbf98d3cbd0f4290 WatchSource:0}: Error finding container 9ebfce5121dc9b6a32fab89c0c6ca3856893424021edbe8cfbf98d3cbd0f4290: Status 404 returned error can't find the container with id 9ebfce5121dc9b6a32fab89c0c6ca3856893424021edbe8cfbf98d3cbd0f4290
Apr 17 17:29:24.444040 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:24.444004 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v" event={"ID":"45e60926-c374-4147-9759-fdb51b823af9","Type":"ContainerStarted","Data":"9ebfce5121dc9b6a32fab89c0c6ca3856893424021edbe8cfbf98d3cbd0f4290"}
Apr 17 17:29:26.454441 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:26.454407 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v" event={"ID":"45e60926-c374-4147-9759-fdb51b823af9","Type":"ContainerStarted","Data":"cb6cc8792a719742073a81978f9496c81beaff64d666eae9c77144ea0d462637"}
Apr 17 17:29:26.454830 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:26.454521 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:37.462852 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:37.462806 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v"
Apr 17 17:29:37.480051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:37.480003 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qb47v" podStartSLOduration=13.304134634 podStartE2EDuration="15.479988633s" podCreationTimestamp="2026-04-17 17:29:22 +0000 UTC" firstStartedPulling="2026-04-17 17:29:23.446128557 +0000 UTC m=+312.778746180" lastFinishedPulling="2026-04-17 17:29:25.621982551 +0000 UTC m=+314.954600179" observedRunningTime="2026-04-17 17:29:26.47146977 +0000 UTC m=+315.804087417" watchObservedRunningTime="2026-04-17 17:29:37.479988633 +0000 UTC m=+326.812606279"
Apr 17 17:29:43.441715 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:29:43.441677 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvcx4"
Apr 17 17:30:17.459818 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.459723 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"]
Apr 17 17:30:17.467572 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.467545 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.470104 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.470075 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l6dq5\""
Apr 17 17:30:17.470104 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.470091 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 17:30:17.471098 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.471074 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 17:30:17.471370 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.471339 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"]
Apr 17 17:30:17.512788 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.512744 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2npks\" (UniqueName: \"kubernetes.io/projected/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-kube-api-access-2npks\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.512992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.512870 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.512992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.512932 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.614250 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.614210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2npks\" (UniqueName: \"kubernetes.io/projected/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-kube-api-access-2npks\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.614439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.614283 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.614439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.614329 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.614670 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.614650 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.614707 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.614696 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.623712 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.623674 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2npks\" (UniqueName: \"kubernetes.io/projected/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-kube-api-access-2npks\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.779644 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.779558 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:17.908228 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:17.908201 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"]
Apr 17 17:30:17.910448 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:30:17.910417 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8cd6197_10f5_4569_ab9f_8b3c76d07d13.slice/crio-1586785531ad9a6481e8a71db4af6456b56ae586df3b040df6db957a9e3bb8c7 WatchSource:0}: Error finding container 1586785531ad9a6481e8a71db4af6456b56ae586df3b040df6db957a9e3bb8c7: Status 404 returned error can't find the container with id 1586785531ad9a6481e8a71db4af6456b56ae586df3b040df6db957a9e3bb8c7
Apr 17 17:30:18.632936 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:18.632906 2565 generic.go:358] "Generic (PLEG): container finished" podID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerID="600245128ef56bbdc82dbb3dd440f42941404aadacc1c202b108b0be0eb87375" exitCode=0
Apr 17 17:30:18.633337 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:18.632956 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz" event={"ID":"b8cd6197-10f5-4569-ab9f-8b3c76d07d13","Type":"ContainerDied","Data":"600245128ef56bbdc82dbb3dd440f42941404aadacc1c202b108b0be0eb87375"}
Apr 17 17:30:18.633337 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:18.632981 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz" event={"ID":"b8cd6197-10f5-4569-ab9f-8b3c76d07d13","Type":"ContainerStarted","Data":"1586785531ad9a6481e8a71db4af6456b56ae586df3b040df6db957a9e3bb8c7"}
Apr 17 17:30:19.638460 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:19.638367 2565 generic.go:358] "Generic (PLEG): container finished" podID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerID="3d94da557d1609a78bf96b6a633ac0326887a043b7b9e6b12d853eb9c35a81f7" exitCode=0
Apr 17 17:30:19.638460 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:19.638443 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz" event={"ID":"b8cd6197-10f5-4569-ab9f-8b3c76d07d13","Type":"ContainerDied","Data":"3d94da557d1609a78bf96b6a633ac0326887a043b7b9e6b12d853eb9c35a81f7"}
Apr 17 17:30:20.644346 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:20.644310 2565 generic.go:358] "Generic (PLEG): container finished" podID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerID="4348bc32c7fd363d177a4cd16e7c3eac4cfb4726ceb024bc7cc48b3243b8e629" exitCode=0
Apr 17 17:30:20.644721 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:20.644395 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz" event={"ID":"b8cd6197-10f5-4569-ab9f-8b3c76d07d13","Type":"ContainerDied","Data":"4348bc32c7fd363d177a4cd16e7c3eac4cfb4726ceb024bc7cc48b3243b8e629"}
Apr 17 17:30:21.772506 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.772479 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:21.849812 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.849769 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-bundle\") pod \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") "
Apr 17 17:30:21.850007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.849822 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2npks\" (UniqueName: \"kubernetes.io/projected/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-kube-api-access-2npks\") pod \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") "
Apr 17 17:30:21.850007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.849914 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-util\") pod \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\" (UID: \"b8cd6197-10f5-4569-ab9f-8b3c76d07d13\") "
Apr 17 17:30:21.850589 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.850561 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-bundle" (OuterVolumeSpecName: "bundle") pod "b8cd6197-10f5-4569-ab9f-8b3c76d07d13" (UID: "b8cd6197-10f5-4569-ab9f-8b3c76d07d13"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:30:21.851981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.851956 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-kube-api-access-2npks" (OuterVolumeSpecName: "kube-api-access-2npks") pod "b8cd6197-10f5-4569-ab9f-8b3c76d07d13" (UID: "b8cd6197-10f5-4569-ab9f-8b3c76d07d13"). InnerVolumeSpecName "kube-api-access-2npks". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:30:21.855630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.855597 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-util" (OuterVolumeSpecName: "util") pod "b8cd6197-10f5-4569-ab9f-8b3c76d07d13" (UID: "b8cd6197-10f5-4569-ab9f-8b3c76d07d13"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:30:21.950784 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.950692 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:30:21.950784 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.950727 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:30:21.950784 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:21.950744 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2npks\" (UniqueName: \"kubernetes.io/projected/b8cd6197-10f5-4569-ab9f-8b3c76d07d13-kube-api-access-2npks\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:30:22.653252 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:22.653221 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz"
Apr 17 17:30:22.653252 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:22.653227 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54tbmz" event={"ID":"b8cd6197-10f5-4569-ab9f-8b3c76d07d13","Type":"ContainerDied","Data":"1586785531ad9a6481e8a71db4af6456b56ae586df3b040df6db957a9e3bb8c7"}
Apr 17 17:30:22.653252 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:22.653259 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1586785531ad9a6481e8a71db4af6456b56ae586df3b040df6db957a9e3bb8c7"
Apr 17 17:30:29.914644 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.914608 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"]
Apr 17 17:30:29.915027 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.915015 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerName="util"
Apr 17 17:30:29.915027 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.915028 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerName="util"
Apr 17 17:30:29.915109 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.915050 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerName="extract"
Apr 17 17:30:29.915109 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.915059 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerName="extract"
Apr 17 17:30:29.915109 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.915074 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerName="pull"
Apr 17 17:30:29.915109 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.915080 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerName="pull"
Apr 17 17:30:29.915223 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.915129 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8cd6197-10f5-4569-ab9f-8b3c76d07d13" containerName="extract"
Apr 17 17:30:29.918146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.918126 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:29.921477 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.921453 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 17:30:29.921596 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.921453 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:30:29.921596 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.921524 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-7jjxr\""
Apr 17 17:30:29.929381 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:29.929354 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"]
Apr 17 17:30:30.019181 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.019124 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9khl6\" (UniqueName: \"kubernetes.io/projected/f54e195f-7152-4484-923e-4a6df394e804-kube-api-access-9khl6\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mkz6l\" (UID:
\"f54e195f-7152-4484-923e-4a6df394e804\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:30.019350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.019246 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f54e195f-7152-4484-923e-4a6df394e804-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mkz6l\" (UID: \"f54e195f-7152-4484-923e-4a6df394e804\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:30.120368 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.120322 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9khl6\" (UniqueName: \"kubernetes.io/projected/f54e195f-7152-4484-923e-4a6df394e804-kube-api-access-9khl6\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mkz6l\" (UID: \"f54e195f-7152-4484-923e-4a6df394e804\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:30.120558 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.120404 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f54e195f-7152-4484-923e-4a6df394e804-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mkz6l\" (UID: \"f54e195f-7152-4484-923e-4a6df394e804\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:30.120878 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.120856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f54e195f-7152-4484-923e-4a6df394e804-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mkz6l\" (UID: \"f54e195f-7152-4484-923e-4a6df394e804\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:30.129104 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.129077 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9khl6\" (UniqueName: \"kubernetes.io/projected/f54e195f-7152-4484-923e-4a6df394e804-kube-api-access-9khl6\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mkz6l\" (UID: \"f54e195f-7152-4484-923e-4a6df394e804\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:30.227598 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.227506 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"
Apr 17 17:30:30.363920 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.363816 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l"]
Apr 17 17:30:30.367128 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:30:30.367098 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf54e195f_7152_4484_923e_4a6df394e804.slice/crio-506fabf9167555d4c6c83558ca50b864e85ae3b08520d8074574016ef32d8af1 WatchSource:0}: Error finding container 506fabf9167555d4c6c83558ca50b864e85ae3b08520d8074574016ef32d8af1: Status 404 returned error can't find the container with id 506fabf9167555d4c6c83558ca50b864e85ae3b08520d8074574016ef32d8af1
Apr 17 17:30:30.681874 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:30.681822 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l" event={"ID":"f54e195f-7152-4484-923e-4a6df394e804","Type":"ContainerStarted","Data":"506fabf9167555d4c6c83558ca50b864e85ae3b08520d8074574016ef32d8af1"}
Apr 17 17:30:33.695142 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:33.695105 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l" event={"ID":"f54e195f-7152-4484-923e-4a6df394e804","Type":"ContainerStarted","Data":"5c73c06bdf2a6a7ccc6148f0228c7f80f38af17cbf985bf4427591cde2a6c784"}
Apr 17 17:30:33.717442 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:33.717386 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mkz6l" podStartSLOduration=2.116121573 podStartE2EDuration="4.717367s" podCreationTimestamp="2026-04-17 17:30:29 +0000 UTC" firstStartedPulling="2026-04-17 17:30:30.369623571 +0000 UTC m=+379.702241195" lastFinishedPulling="2026-04-17 17:30:32.970868995 +0000 UTC m=+382.303486622" observedRunningTime="2026-04-17 17:30:33.716915478 +0000 UTC m=+383.049533121" watchObservedRunningTime="2026-04-17 17:30:33.717367 +0000 UTC m=+383.049984648"
Apr 17 17:30:39.167218 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.167185 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pmwdh"]
Apr 17 17:30:39.170630 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.170612 2565 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.173235 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.173208 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 17:30:39.173360 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.173251 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-8jmhg\"" Apr 17 17:30:39.173360 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.173274 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 17:30:39.178335 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.178302 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pmwdh"] Apr 17 17:30:39.300564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.300528 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a77e37e-96b5-4383-a37c-aa26cd5f39ff-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pmwdh\" (UID: \"3a77e37e-96b5-4383-a37c-aa26cd5f39ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.300716 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.300585 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68xv\" (UniqueName: \"kubernetes.io/projected/3a77e37e-96b5-4383-a37c-aa26cd5f39ff-kube-api-access-w68xv\") pod \"cert-manager-cainjector-8966b78d4-pmwdh\" (UID: \"3a77e37e-96b5-4383-a37c-aa26cd5f39ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.366915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.366861 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j"] Apr 17 17:30:39.372257 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.372230 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.375077 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.375049 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:30:39.375235 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.375080 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l6dq5\"" Apr 17 17:30:39.375235 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.375171 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:30:39.377363 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.377338 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j"] Apr 17 17:30:39.401140 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.401104 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w68xv\" (UniqueName: \"kubernetes.io/projected/3a77e37e-96b5-4383-a37c-aa26cd5f39ff-kube-api-access-w68xv\") pod \"cert-manager-cainjector-8966b78d4-pmwdh\" (UID: \"3a77e37e-96b5-4383-a37c-aa26cd5f39ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.401303 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.401202 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a77e37e-96b5-4383-a37c-aa26cd5f39ff-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pmwdh\" (UID: 
\"3a77e37e-96b5-4383-a37c-aa26cd5f39ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.409239 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.409214 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a77e37e-96b5-4383-a37c-aa26cd5f39ff-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pmwdh\" (UID: \"3a77e37e-96b5-4383-a37c-aa26cd5f39ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.409372 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.409360 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68xv\" (UniqueName: \"kubernetes.io/projected/3a77e37e-96b5-4383-a37c-aa26cd5f39ff-kube-api-access-w68xv\") pod \"cert-manager-cainjector-8966b78d4-pmwdh\" (UID: \"3a77e37e-96b5-4383-a37c-aa26cd5f39ff\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.490507 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.490418 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" Apr 17 17:30:39.502485 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.502452 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdl75\" (UniqueName: \"kubernetes.io/projected/a70cfa0b-6fda-45af-9ed4-34ddf807809d-kube-api-access-sdl75\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.502598 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.502500 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.502672 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.502649 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.604103 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.604072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.604246 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.604124 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdl75\" (UniqueName: \"kubernetes.io/projected/a70cfa0b-6fda-45af-9ed4-34ddf807809d-kube-api-access-sdl75\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.604246 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.604157 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.604430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.604410 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.604486 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.604469 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 
17 17:30:39.614255 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.614232 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pmwdh"] Apr 17 17:30:39.615166 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.615143 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdl75\" (UniqueName: \"kubernetes.io/projected/a70cfa0b-6fda-45af-9ed4-34ddf807809d-kube-api-access-sdl75\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.616703 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:30:39.616668 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a77e37e_96b5_4383_a37c_aa26cd5f39ff.slice/crio-7f4ddccdb339c9452985452f7ebe12858822ddb5ac4cb2ffb553e2617115c831 WatchSource:0}: Error finding container 7f4ddccdb339c9452985452f7ebe12858822ddb5ac4cb2ffb553e2617115c831: Status 404 returned error can't find the container with id 7f4ddccdb339c9452985452f7ebe12858822ddb5ac4cb2ffb553e2617115c831 Apr 17 17:30:39.682587 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.682549 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:39.717784 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.717739 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" event={"ID":"3a77e37e-96b5-4383-a37c-aa26cd5f39ff","Type":"ContainerStarted","Data":"7f4ddccdb339c9452985452f7ebe12858822ddb5ac4cb2ffb553e2617115c831"} Apr 17 17:30:39.832623 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:39.832548 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j"] Apr 17 17:30:40.722680 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:40.722642 2565 generic.go:358] "Generic (PLEG): container finished" podID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerID="1cd46b9754caa5df7afd07a41e7c6fc0c5d61e46bf8a3c45d7c28e2dcc9081ef" exitCode=0 Apr 17 17:30:40.723113 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:40.722714 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" event={"ID":"a70cfa0b-6fda-45af-9ed4-34ddf807809d","Type":"ContainerDied","Data":"1cd46b9754caa5df7afd07a41e7c6fc0c5d61e46bf8a3c45d7c28e2dcc9081ef"} Apr 17 17:30:40.723113 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:40.722743 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" event={"ID":"a70cfa0b-6fda-45af-9ed4-34ddf807809d","Type":"ContainerStarted","Data":"b8afceffc10520ce9fa4598e025c0f73b7be7998af221876321a34f36b8225d0"} Apr 17 17:30:44.746501 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:44.746460 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" 
event={"ID":"3a77e37e-96b5-4383-a37c-aa26cd5f39ff","Type":"ContainerStarted","Data":"60c3e459bb93390ae36628041259a702d36a3d5137f8b5d351962a881298bb12"} Apr 17 17:30:44.747978 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:44.747949 2565 generic.go:358] "Generic (PLEG): container finished" podID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerID="cf7c1e9f1f7bb71104a6d5ddcd0d770c0c7a8e02c6e65263a774d47a2918587f" exitCode=0 Apr 17 17:30:44.748112 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:44.748014 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" event={"ID":"a70cfa0b-6fda-45af-9ed4-34ddf807809d","Type":"ContainerDied","Data":"cf7c1e9f1f7bb71104a6d5ddcd0d770c0c7a8e02c6e65263a774d47a2918587f"} Apr 17 17:30:44.766136 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:44.766078 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-pmwdh" podStartSLOduration=1.756772064 podStartE2EDuration="5.766052462s" podCreationTimestamp="2026-04-17 17:30:39 +0000 UTC" firstStartedPulling="2026-04-17 17:30:39.618513523 +0000 UTC m=+388.951131146" lastFinishedPulling="2026-04-17 17:30:43.627793916 +0000 UTC m=+392.960411544" observedRunningTime="2026-04-17 17:30:44.762463345 +0000 UTC m=+394.095080984" watchObservedRunningTime="2026-04-17 17:30:44.766052462 +0000 UTC m=+394.098670108" Apr 17 17:30:45.753133 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:45.753094 2565 generic.go:358] "Generic (PLEG): container finished" podID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerID="2456a14489093c21595ba85e4bf7158a1fd96a045a74fafa81421d7745967d9e" exitCode=0 Apr 17 17:30:45.753587 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:45.753181 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" 
event={"ID":"a70cfa0b-6fda-45af-9ed4-34ddf807809d","Type":"ContainerDied","Data":"2456a14489093c21595ba85e4bf7158a1fd96a045a74fafa81421d7745967d9e"} Apr 17 17:30:46.891424 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:46.891397 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:46.973350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:46.973314 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-bundle\") pod \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " Apr 17 17:30:46.973496 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:46.973434 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdl75\" (UniqueName: \"kubernetes.io/projected/a70cfa0b-6fda-45af-9ed4-34ddf807809d-kube-api-access-sdl75\") pod \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " Apr 17 17:30:46.973554 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:46.973499 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-util\") pod \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\" (UID: \"a70cfa0b-6fda-45af-9ed4-34ddf807809d\") " Apr 17 17:30:46.973829 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:46.973809 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-bundle" (OuterVolumeSpecName: "bundle") pod "a70cfa0b-6fda-45af-9ed4-34ddf807809d" (UID: "a70cfa0b-6fda-45af-9ed4-34ddf807809d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:30:46.975552 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:46.975524 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70cfa0b-6fda-45af-9ed4-34ddf807809d-kube-api-access-sdl75" (OuterVolumeSpecName: "kube-api-access-sdl75") pod "a70cfa0b-6fda-45af-9ed4-34ddf807809d" (UID: "a70cfa0b-6fda-45af-9ed4-34ddf807809d"). InnerVolumeSpecName "kube-api-access-sdl75". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:30:46.978025 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:46.977993 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-util" (OuterVolumeSpecName: "util") pod "a70cfa0b-6fda-45af-9ed4-34ddf807809d" (UID: "a70cfa0b-6fda-45af-9ed4-34ddf807809d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:30:47.074511 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:47.074457 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:30:47.074511 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:47.074510 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdl75\" (UniqueName: \"kubernetes.io/projected/a70cfa0b-6fda-45af-9ed4-34ddf807809d-kube-api-access-sdl75\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:30:47.074511 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:47.074522 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a70cfa0b-6fda-45af-9ed4-34ddf807809d-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:30:47.762393 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:47.762357 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" event={"ID":"a70cfa0b-6fda-45af-9ed4-34ddf807809d","Type":"ContainerDied","Data":"b8afceffc10520ce9fa4598e025c0f73b7be7998af221876321a34f36b8225d0"} Apr 17 17:30:47.762393 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:47.762390 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8afceffc10520ce9fa4598e025c0f73b7be7998af221876321a34f36b8225d0" Apr 17 17:30:47.762623 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:47.762403 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fpc28j" Apr 17 17:30:54.901051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901016 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-k7kzd"] Apr 17 17:30:54.901510 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901368 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerName="extract" Apr 17 17:30:54.901510 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901379 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerName="extract" Apr 17 17:30:54.901510 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901391 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerName="util" Apr 17 17:30:54.901510 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901398 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerName="util" Apr 17 17:30:54.901510 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901409 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" 
containerName="pull" Apr 17 17:30:54.901510 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901414 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerName="pull" Apr 17 17:30:54.901510 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.901468 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a70cfa0b-6fda-45af-9ed4-34ddf807809d" containerName="extract" Apr 17 17:30:54.913647 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.913617 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-k7kzd"] Apr 17 17:30:54.913793 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.913764 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:54.916175 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:54.916153 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-8s25m\"" Apr 17 17:30:55.041172 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.041126 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d7b9d50-b487-46a4-963e-294669578bb0-bound-sa-token\") pod \"cert-manager-759f64656b-k7kzd\" (UID: \"1d7b9d50-b487-46a4-963e-294669578bb0\") " pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:55.041359 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.041283 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75p24\" (UniqueName: \"kubernetes.io/projected/1d7b9d50-b487-46a4-963e-294669578bb0-kube-api-access-75p24\") pod \"cert-manager-759f64656b-k7kzd\" (UID: \"1d7b9d50-b487-46a4-963e-294669578bb0\") " pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:55.142763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.142721 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d7b9d50-b487-46a4-963e-294669578bb0-bound-sa-token\") pod \"cert-manager-759f64656b-k7kzd\" (UID: \"1d7b9d50-b487-46a4-963e-294669578bb0\") " pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:55.142959 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.142808 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75p24\" (UniqueName: \"kubernetes.io/projected/1d7b9d50-b487-46a4-963e-294669578bb0-kube-api-access-75p24\") pod \"cert-manager-759f64656b-k7kzd\" (UID: \"1d7b9d50-b487-46a4-963e-294669578bb0\") " pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:55.151656 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.151585 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75p24\" (UniqueName: \"kubernetes.io/projected/1d7b9d50-b487-46a4-963e-294669578bb0-kube-api-access-75p24\") pod \"cert-manager-759f64656b-k7kzd\" (UID: \"1d7b9d50-b487-46a4-963e-294669578bb0\") " pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:55.151961 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.151944 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d7b9d50-b487-46a4-963e-294669578bb0-bound-sa-token\") pod \"cert-manager-759f64656b-k7kzd\" (UID: \"1d7b9d50-b487-46a4-963e-294669578bb0\") " pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:55.223315 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.223279 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-k7kzd" Apr 17 17:30:55.349618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.349594 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-k7kzd"] Apr 17 17:30:55.351526 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:30:55.351500 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d7b9d50_b487_46a4_963e_294669578bb0.slice/crio-5165974a98f393eac20c9412376490d8ab11ec5015c46549cc9d1dc776a6e703 WatchSource:0}: Error finding container 5165974a98f393eac20c9412376490d8ab11ec5015c46549cc9d1dc776a6e703: Status 404 returned error can't find the container with id 5165974a98f393eac20c9412376490d8ab11ec5015c46549cc9d1dc776a6e703 Apr 17 17:30:55.791291 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.791200 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-k7kzd" event={"ID":"1d7b9d50-b487-46a4-963e-294669578bb0","Type":"ContainerStarted","Data":"8eb9cca9db8084e786209e8f489000ccb34c334ebf501abea6a59ece4434abf0"} Apr 17 17:30:55.791291 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.791243 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-k7kzd" event={"ID":"1d7b9d50-b487-46a4-963e-294669578bb0","Type":"ContainerStarted","Data":"5165974a98f393eac20c9412376490d8ab11ec5015c46549cc9d1dc776a6e703"} Apr 17 17:30:55.815300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:30:55.815243 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-k7kzd" podStartSLOduration=1.815223402 podStartE2EDuration="1.815223402s" podCreationTimestamp="2026-04-17 17:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:30:55.813686993 +0000 UTC m=+405.146304640" 
watchObservedRunningTime="2026-04-17 17:30:55.815223402 +0000 UTC m=+405.147841049" Apr 17 17:31:07.614781 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.614738 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs"] Apr 17 17:31:07.618720 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.618702 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.621501 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.621481 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:31:07.621599 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.621508 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:31:07.621599 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.621529 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l6dq5\"" Apr 17 17:31:07.627096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.627075 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs"] Apr 17 17:31:07.643732 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.643693 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.643911 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:31:07.643814 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp988\" (UniqueName: \"kubernetes.io/projected/ac83d6a2-adf5-4dab-b210-bf965bd2584b-kube-api-access-bp988\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.643911 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.643885 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.744412 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.744371 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp988\" (UniqueName: \"kubernetes.io/projected/ac83d6a2-adf5-4dab-b210-bf965bd2584b-kube-api-access-bp988\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.744606 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.744421 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.744606 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.744469 
2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.744883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.744865 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.744968 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.744905 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.753155 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.753124 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp988\" (UniqueName: \"kubernetes.io/projected/ac83d6a2-adf5-4dab-b210-bf965bd2584b-kube-api-access-bp988\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:07.929222 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:07.929140 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:08.054516 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:08.054492 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs"] Apr 17 17:31:08.056252 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:31:08.056211 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac83d6a2_adf5_4dab_b210_bf965bd2584b.slice/crio-6e3aba81fc9b485cddae479cbae8f29df33934b642c08c16a7311279334aea06 WatchSource:0}: Error finding container 6e3aba81fc9b485cddae479cbae8f29df33934b642c08c16a7311279334aea06: Status 404 returned error can't find the container with id 6e3aba81fc9b485cddae479cbae8f29df33934b642c08c16a7311279334aea06 Apr 17 17:31:08.840166 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:08.840128 2565 generic.go:358] "Generic (PLEG): container finished" podID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerID="55acd6c4b092f833e87185b3590cba5419f0f654c189badbaed6c4be86b74493" exitCode=0 Apr 17 17:31:08.840629 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:08.840181 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" event={"ID":"ac83d6a2-adf5-4dab-b210-bf965bd2584b","Type":"ContainerDied","Data":"55acd6c4b092f833e87185b3590cba5419f0f654c189badbaed6c4be86b74493"} Apr 17 17:31:08.840629 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:08.840214 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" event={"ID":"ac83d6a2-adf5-4dab-b210-bf965bd2584b","Type":"ContainerStarted","Data":"6e3aba81fc9b485cddae479cbae8f29df33934b642c08c16a7311279334aea06"} Apr 17 17:31:09.845360 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:31:09.845328 2565 generic.go:358] "Generic (PLEG): container finished" podID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerID="8f8bb442b978be9409bd3f7b26409c9500d61d9d0c9ae0660731b8476d9ff735" exitCode=0 Apr 17 17:31:09.845714 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:09.845411 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" event={"ID":"ac83d6a2-adf5-4dab-b210-bf965bd2584b","Type":"ContainerDied","Data":"8f8bb442b978be9409bd3f7b26409c9500d61d9d0c9ae0660731b8476d9ff735"} Apr 17 17:31:10.850463 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:10.850427 2565 generic.go:358] "Generic (PLEG): container finished" podID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerID="0e8daab62aa737f9f73d2ef3ffa9bef0636b8e0997a1f99236f7b1891f87e324" exitCode=0 Apr 17 17:31:10.850822 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:10.850517 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" event={"ID":"ac83d6a2-adf5-4dab-b210-bf965bd2584b","Type":"ContainerDied","Data":"0e8daab62aa737f9f73d2ef3ffa9bef0636b8e0997a1f99236f7b1891f87e324"} Apr 17 17:31:11.989980 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:11.989953 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:12.078500 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.078448 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-util\") pod \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " Apr 17 17:31:12.078500 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.078503 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-bundle\") pod \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " Apr 17 17:31:12.078713 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.078567 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp988\" (UniqueName: \"kubernetes.io/projected/ac83d6a2-adf5-4dab-b210-bf965bd2584b-kube-api-access-bp988\") pod \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\" (UID: \"ac83d6a2-adf5-4dab-b210-bf965bd2584b\") " Apr 17 17:31:12.079602 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.079570 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-bundle" (OuterVolumeSpecName: "bundle") pod "ac83d6a2-adf5-4dab-b210-bf965bd2584b" (UID: "ac83d6a2-adf5-4dab-b210-bf965bd2584b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:12.080587 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.080558 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac83d6a2-adf5-4dab-b210-bf965bd2584b-kube-api-access-bp988" (OuterVolumeSpecName: "kube-api-access-bp988") pod "ac83d6a2-adf5-4dab-b210-bf965bd2584b" (UID: "ac83d6a2-adf5-4dab-b210-bf965bd2584b"). InnerVolumeSpecName "kube-api-access-bp988". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:12.084266 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.084227 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-util" (OuterVolumeSpecName: "util") pod "ac83d6a2-adf5-4dab-b210-bf965bd2584b" (UID: "ac83d6a2-adf5-4dab-b210-bf965bd2584b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:12.179524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.179447 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:12.179524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.179471 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac83d6a2-adf5-4dab-b210-bf965bd2584b-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:12.179524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.179481 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bp988\" (UniqueName: \"kubernetes.io/projected/ac83d6a2-adf5-4dab-b210-bf965bd2584b-kube-api-access-bp988\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:12.860225 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.860194 2565 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" event={"ID":"ac83d6a2-adf5-4dab-b210-bf965bd2584b","Type":"ContainerDied","Data":"6e3aba81fc9b485cddae479cbae8f29df33934b642c08c16a7311279334aea06"} Apr 17 17:31:12.860225 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.860225 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e3aba81fc9b485cddae479cbae8f29df33934b642c08c16a7311279334aea06" Apr 17 17:31:12.860442 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:12.860232 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835jxdbs" Apr 17 17:31:18.731284 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731248 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj"] Apr 17 17:31:18.731683 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731613 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerName="util" Apr 17 17:31:18.731683 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731624 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerName="util" Apr 17 17:31:18.731683 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731632 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerName="extract" Apr 17 17:31:18.731683 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731638 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerName="extract" Apr 17 17:31:18.731683 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731652 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerName="pull" Apr 17 17:31:18.731683 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731657 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerName="pull" Apr 17 17:31:18.731912 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.731723 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac83d6a2-adf5-4dab-b210-bf965bd2584b" containerName="extract" Apr 17 17:31:18.736044 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.736024 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.740169 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.740139 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 17:31:18.740350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.740174 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:31:18.740350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.740141 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 17:31:18.740350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.740178 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 17:31:18.740350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.740228 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 17:31:18.740350 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.740137 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sjwd4\"" 
Apr 17 17:31:18.746852 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.746818 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj"] Apr 17 17:31:18.829605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.829564 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdwtf\" (UniqueName: \"kubernetes.io/projected/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-kube-api-access-vdwtf\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.829605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.829607 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-manager-config\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.829825 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.829689 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-metrics-cert\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.829825 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.829756 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-cert\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " 
pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.930320 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.930278 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdwtf\" (UniqueName: \"kubernetes.io/projected/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-kube-api-access-vdwtf\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.930320 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.930319 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-manager-config\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.930569 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.930367 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-metrics-cert\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.930569 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.930406 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-cert\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.931134 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.931107 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" 
(UniqueName: \"kubernetes.io/configmap/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-manager-config\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.932822 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.932803 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-cert\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.932925 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.932820 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-metrics-cert\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:18.987888 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:18.987798 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdwtf\" (UniqueName: \"kubernetes.io/projected/34cb13f4-0a7d-4a07-9b4c-2e858b17357c-kube-api-access-vdwtf\") pod \"lws-controller-manager-6dd684f56d-x5nwj\" (UID: \"34cb13f4-0a7d-4a07-9b4c-2e858b17357c\") " pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:19.046046 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.046011 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:19.246643 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.246617 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj"] Apr 17 17:31:19.249303 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:31:19.249270 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cb13f4_0a7d_4a07_9b4c_2e858b17357c.slice/crio-6c338acdc5dd200f699389fa67b02dbff01025f3d1fb3c7705d28dcd827fb5d4 WatchSource:0}: Error finding container 6c338acdc5dd200f699389fa67b02dbff01025f3d1fb3c7705d28dcd827fb5d4: Status 404 returned error can't find the container with id 6c338acdc5dd200f699389fa67b02dbff01025f3d1fb3c7705d28dcd827fb5d4 Apr 17 17:31:19.719134 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.719100 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj"] Apr 17 17:31:19.723016 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.722989 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.726940 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.726913 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l6dq5\"" Apr 17 17:31:19.727220 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.727190 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:31:19.728252 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.728232 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:31:19.735351 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.735322 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.735729 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.735364 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.735729 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.735490 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzq9\" (UniqueName: 
\"kubernetes.io/projected/24c32525-46b8-4c66-9357-a8b7230a5f20-kube-api-access-vvzq9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.741100 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.741073 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj"] Apr 17 17:31:19.835994 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.835957 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.836195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.835999 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.836195 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.836103 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzq9\" (UniqueName: \"kubernetes.io/projected/24c32525-46b8-4c66-9357-a8b7230a5f20-kube-api-access-vvzq9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.836435 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.836403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.836561 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.836482 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.848188 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.848156 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzq9\" (UniqueName: \"kubernetes.io/projected/24c32525-46b8-4c66-9357-a8b7230a5f20-kube-api-access-vvzq9\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:19.887451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:19.887396 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" event={"ID":"34cb13f4-0a7d-4a07-9b4c-2e858b17357c","Type":"ContainerStarted","Data":"6c338acdc5dd200f699389fa67b02dbff01025f3d1fb3c7705d28dcd827fb5d4"} Apr 17 17:31:20.035348 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:20.035257 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:20.220359 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:20.220315 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj"] Apr 17 17:31:20.592592 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:31:20.592551 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24c32525_46b8_4c66_9357_a8b7230a5f20.slice/crio-42718be0ac7d8814430ff99ee09994adad89e21dca61dbf3943ada8abcb293ee WatchSource:0}: Error finding container 42718be0ac7d8814430ff99ee09994adad89e21dca61dbf3943ada8abcb293ee: Status 404 returned error can't find the container with id 42718be0ac7d8814430ff99ee09994adad89e21dca61dbf3943ada8abcb293ee Apr 17 17:31:20.892588 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:20.892499 2565 generic.go:358] "Generic (PLEG): container finished" podID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerID="4e9b14307bb9d51ca097f5c3997a1230893d3056649bb625406c115552f9104b" exitCode=0 Apr 17 17:31:20.893055 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:20.892588 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" event={"ID":"24c32525-46b8-4c66-9357-a8b7230a5f20","Type":"ContainerDied","Data":"4e9b14307bb9d51ca097f5c3997a1230893d3056649bb625406c115552f9104b"} Apr 17 17:31:20.893055 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:20.892627 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" event={"ID":"24c32525-46b8-4c66-9357-a8b7230a5f20","Type":"ContainerStarted","Data":"42718be0ac7d8814430ff99ee09994adad89e21dca61dbf3943ada8abcb293ee"} Apr 17 17:31:20.894117 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:31:20.893982 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" event={"ID":"34cb13f4-0a7d-4a07-9b4c-2e858b17357c","Type":"ContainerStarted","Data":"ef12408f99cf633cb954f28aed7e145f1987c81fd58cb79c71ea4ad49faa94c1"} Apr 17 17:31:20.894117 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:20.894111 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" Apr 17 17:31:20.966388 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:20.966333 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj" podStartSLOduration=1.559584805 podStartE2EDuration="2.966316401s" podCreationTimestamp="2026-04-17 17:31:18 +0000 UTC" firstStartedPulling="2026-04-17 17:31:19.250995544 +0000 UTC m=+428.583613168" lastFinishedPulling="2026-04-17 17:31:20.657727126 +0000 UTC m=+429.990344764" observedRunningTime="2026-04-17 17:31:20.964699758 +0000 UTC m=+430.297317405" watchObservedRunningTime="2026-04-17 17:31:20.966316401 +0000 UTC m=+430.298934047" Apr 17 17:31:21.899161 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:21.899121 2565 generic.go:358] "Generic (PLEG): container finished" podID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerID="4b58d8f8fc1473b7ec9d11e6470e9dd71970bda83447a838a7f1454862e726f5" exitCode=0 Apr 17 17:31:21.899649 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:21.899209 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" event={"ID":"24c32525-46b8-4c66-9357-a8b7230a5f20","Type":"ContainerDied","Data":"4b58d8f8fc1473b7ec9d11e6470e9dd71970bda83447a838a7f1454862e726f5"} Apr 17 17:31:22.904976 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:22.904937 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerID="024d9f2442c213b22f36efefbf4a8d55b00d44360e39610bc91de9273c93de18" exitCode=0 Apr 17 17:31:22.905458 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:22.905025 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" event={"ID":"24c32525-46b8-4c66-9357-a8b7230a5f20","Type":"ContainerDied","Data":"024d9f2442c213b22f36efefbf4a8d55b00d44360e39610bc91de9273c93de18"} Apr 17 17:31:24.036811 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.036787 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" Apr 17 17:31:24.074853 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.074808 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-bundle\") pod \"24c32525-46b8-4c66-9357-a8b7230a5f20\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " Apr 17 17:31:24.075013 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.074962 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvzq9\" (UniqueName: \"kubernetes.io/projected/24c32525-46b8-4c66-9357-a8b7230a5f20-kube-api-access-vvzq9\") pod \"24c32525-46b8-4c66-9357-a8b7230a5f20\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " Apr 17 17:31:24.075064 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.075044 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-util\") pod \"24c32525-46b8-4c66-9357-a8b7230a5f20\" (UID: \"24c32525-46b8-4c66-9357-a8b7230a5f20\") " Apr 17 17:31:24.075729 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.075706 2565 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-bundle" (OuterVolumeSpecName: "bundle") pod "24c32525-46b8-4c66-9357-a8b7230a5f20" (UID: "24c32525-46b8-4c66-9357-a8b7230a5f20"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:24.077044 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.077007 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24c32525-46b8-4c66-9357-a8b7230a5f20-kube-api-access-vvzq9" (OuterVolumeSpecName: "kube-api-access-vvzq9") pod "24c32525-46b8-4c66-9357-a8b7230a5f20" (UID: "24c32525-46b8-4c66-9357-a8b7230a5f20"). InnerVolumeSpecName "kube-api-access-vvzq9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:24.080666 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.080635 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-util" (OuterVolumeSpecName: "util") pod "24c32525-46b8-4c66-9357-a8b7230a5f20" (UID: "24c32525-46b8-4c66-9357-a8b7230a5f20"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:24.176331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.176223 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vvzq9\" (UniqueName: \"kubernetes.io/projected/24c32525-46b8-4c66-9357-a8b7230a5f20-kube-api-access-vvzq9\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:24.176331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.176258 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:24.176331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.176268 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24c32525-46b8-4c66-9357-a8b7230a5f20-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:24.915420 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.915391 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj"
Apr 17 17:31:24.915622 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.915386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2zt4gj" event={"ID":"24c32525-46b8-4c66-9357-a8b7230a5f20","Type":"ContainerDied","Data":"42718be0ac7d8814430ff99ee09994adad89e21dca61dbf3943ada8abcb293ee"}
Apr 17 17:31:24.915622 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:24.915500 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42718be0ac7d8814430ff99ee09994adad89e21dca61dbf3943ada8abcb293ee"
Apr 17 17:31:31.901993 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:31.901960 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6dd684f56d-x5nwj"
Apr 17 17:31:51.717214 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717174 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"]
Apr 17 17:31:51.717711 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717691 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerName="extract"
Apr 17 17:31:51.717789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717715 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerName="extract"
Apr 17 17:31:51.717789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717729 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerName="pull"
Apr 17 17:31:51.717789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717739 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerName="pull"
Apr 17 17:31:51.717789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717769 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerName="util"
Apr 17 17:31:51.717789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717779 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerName="util"
Apr 17 17:31:51.718074 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.717914 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="24c32525-46b8-4c66-9357-a8b7230a5f20" containerName="extract"
Apr 17 17:31:51.722139 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.722117 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.724944 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.724921 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 17:31:51.725988 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.725971 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 17:31:51.726082 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.725973 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-l6dq5\""
Apr 17 17:31:51.728995 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.728975 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"]
Apr 17 17:31:51.816311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.816269 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"]
Apr 17 17:31:51.819742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.819719 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:51.822211 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.822189 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.822306 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.822219 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.822364 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.822331 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddrp\" (UniqueName: \"kubernetes.io/projected/779eb2ca-4e40-435c-9f03-6a8a80b19007-kube-api-access-xddrp\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.829292 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.829260 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"]
Apr 17 17:31:51.923147 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923092 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.923147 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923143 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.923472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923198 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:51.923472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923277 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:51.923472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923366 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xddrp\" (UniqueName: \"kubernetes.io/projected/779eb2ca-4e40-435c-9f03-6a8a80b19007-kube-api-access-xddrp\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.923472 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923431 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkntn\" (UniqueName: \"kubernetes.io/projected/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-kube-api-access-fkntn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:51.923704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923536 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.923704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.923629 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.925718 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.925695 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"]
Apr 17 17:31:51.929097 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.929083 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:51.933709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.933685 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddrp\" (UniqueName: \"kubernetes.io/projected/779eb2ca-4e40-435c-9f03-6a8a80b19007-kube-api-access-xddrp\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:51.938423 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:51.938399 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"]
Apr 17 17:31:52.020471 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.020388 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"]
Apr 17 17:31:52.024063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024040 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.024308 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:52.024377 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024335 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkntn\" (UniqueName: \"kubernetes.io/projected/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-kube-api-access-fkntn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:52.024444 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.024444 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024400 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.024444 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024423 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dfk\" (UniqueName: \"kubernetes.io/projected/b8a36306-fd09-4739-9a23-f76b249700d5-kube-api-access-t9dfk\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.024590 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024475 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:52.024634 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024605 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:52.024751 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.024736 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:52.031701 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.031682 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:52.031816 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.031769 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"]
Apr 17 17:31:52.034346 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.034325 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkntn\" (UniqueName: \"kubernetes.io/projected/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-kube-api-access-fkntn\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:52.125807 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.125766 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7v9x\" (UniqueName: \"kubernetes.io/projected/75902eeb-c300-4a8d-b84b-fc8142bfb723-kube-api-access-x7v9x\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.125956 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.125821 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.126016 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.125998 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.126068 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.126031 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.126068 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.126057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9dfk\" (UniqueName: \"kubernetes.io/projected/b8a36306-fd09-4739-9a23-f76b249700d5-kube-api-access-t9dfk\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.126180 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.126105 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.126458 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.126420 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.126585 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.126465 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.129580 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.129560 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:52.139167 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.139139 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dfk\" (UniqueName: \"kubernetes.io/projected/b8a36306-fd09-4739-9a23-f76b249700d5-kube-api-access-t9dfk\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.162439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.162410 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"]
Apr 17 17:31:52.164192 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:31:52.164159 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod779eb2ca_4e40_435c_9f03_6a8a80b19007.slice/crio-14908aa4eaf9833f6fa0b33c82b71ee4ddb8ac9cb24ae953327eaa14b59fa326 WatchSource:0}: Error finding container 14908aa4eaf9833f6fa0b33c82b71ee4ddb8ac9cb24ae953327eaa14b59fa326: Status 404 returned error can't find the container with id 14908aa4eaf9833f6fa0b33c82b71ee4ddb8ac9cb24ae953327eaa14b59fa326
Apr 17 17:31:52.227066 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.227033 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7v9x\" (UniqueName: \"kubernetes.io/projected/75902eeb-c300-4a8d-b84b-fc8142bfb723-kube-api-access-x7v9x\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.227187 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.227077 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.227253 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.227219 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.227566 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.227541 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.227644 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.227617 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.242122 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.242083 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7v9x\" (UniqueName: \"kubernetes.io/projected/75902eeb-c300-4a8d-b84b-fc8142bfb723-kube-api-access-x7v9x\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.249794 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.249758 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"
Apr 17 17:31:52.267397 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.267366 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"]
Apr 17 17:31:52.269316 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:31:52.269284 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2cdabb5_8b38_4c49_ac05_7e9c2954af26.slice/crio-68e8db77d66e5d0ab07091d772d0becad6e61fc477542ed1e256c424c294faae WatchSource:0}: Error finding container 68e8db77d66e5d0ab07091d772d0becad6e61fc477542ed1e256c424c294faae: Status 404 returned error can't find the container with id 68e8db77d66e5d0ab07091d772d0becad6e61fc477542ed1e256c424c294faae
Apr 17 17:31:52.335251 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.335223 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"
Apr 17 17:31:52.385154 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.385115 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4"]
Apr 17 17:31:52.387241 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:31:52.387213 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8a36306_fd09_4739_9a23_f76b249700d5.slice/crio-cbcf2d06f5d3b5a2f01dd43bd53740d0101f22e51331c7fefeb074ecaf5ac98e WatchSource:0}: Error finding container cbcf2d06f5d3b5a2f01dd43bd53740d0101f22e51331c7fefeb074ecaf5ac98e: Status 404 returned error can't find the container with id cbcf2d06f5d3b5a2f01dd43bd53740d0101f22e51331c7fefeb074ecaf5ac98e
Apr 17 17:31:52.461313 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:31:52.461272 2565 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2cdabb5_8b38_4c49_ac05_7e9c2954af26.slice/crio-5eebafb6127adb1e931a5cfc57aca22228abb80e79ff8b19fd598fd5c6bb3c14.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2cdabb5_8b38_4c49_ac05_7e9c2954af26.slice/crio-conmon-5eebafb6127adb1e931a5cfc57aca22228abb80e79ff8b19fd598fd5c6bb3c14.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 17:31:52.503379 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:52.503352 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt"]
Apr 17 17:31:52.505667 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:31:52.505643 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75902eeb_c300_4a8d_b84b_fc8142bfb723.slice/crio-ab0b26c1f927055003c3b97340472095be6587aba163889d543a58f84fcb1738 WatchSource:0}: Error finding container ab0b26c1f927055003c3b97340472095be6587aba163889d543a58f84fcb1738: Status 404 returned error can't find the container with id ab0b26c1f927055003c3b97340472095be6587aba163889d543a58f84fcb1738
Apr 17 17:31:53.020358 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.020314 2565 generic.go:358] "Generic (PLEG): container finished" podID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerID="f1efd572ec2fe870a29401cfef0bf59f62f3c52b7a91ae5505e0dbe7a7038705" exitCode=0
Apr 17 17:31:53.020765 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.020385 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt" event={"ID":"75902eeb-c300-4a8d-b84b-fc8142bfb723","Type":"ContainerDied","Data":"f1efd572ec2fe870a29401cfef0bf59f62f3c52b7a91ae5505e0dbe7a7038705"}
Apr 17 17:31:53.020765 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.020420 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt" event={"ID":"75902eeb-c300-4a8d-b84b-fc8142bfb723","Type":"ContainerStarted","Data":"ab0b26c1f927055003c3b97340472095be6587aba163889d543a58f84fcb1738"}
Apr 17 17:31:53.021820 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.021722 2565 generic.go:358] "Generic (PLEG): container finished" podID="b8a36306-fd09-4739-9a23-f76b249700d5" containerID="331b93f08be946f39f466fec7b54ffd8fe4ab986af161fbc244e1e9bc2efc286" exitCode=0
Apr 17 17:31:53.021820 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.021805 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4" event={"ID":"b8a36306-fd09-4739-9a23-f76b249700d5","Type":"ContainerDied","Data":"331b93f08be946f39f466fec7b54ffd8fe4ab986af161fbc244e1e9bc2efc286"}
Apr 17 17:31:53.021985 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.021857 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4" event={"ID":"b8a36306-fd09-4739-9a23-f76b249700d5","Type":"ContainerStarted","Data":"cbcf2d06f5d3b5a2f01dd43bd53740d0101f22e51331c7fefeb074ecaf5ac98e"}
Apr 17 17:31:53.023257 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.023234 2565 generic.go:358] "Generic (PLEG): container finished" podID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerID="c4333bea751de4d2be3d23d4eac8b0a56338433e2ea1e1c0b4dd96936079b6af" exitCode=0
Apr 17 17:31:53.023324 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.023262 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt" event={"ID":"779eb2ca-4e40-435c-9f03-6a8a80b19007","Type":"ContainerDied","Data":"c4333bea751de4d2be3d23d4eac8b0a56338433e2ea1e1c0b4dd96936079b6af"}
Apr 17 17:31:53.023324 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.023288 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt" event={"ID":"779eb2ca-4e40-435c-9f03-6a8a80b19007","Type":"ContainerStarted","Data":"14908aa4eaf9833f6fa0b33c82b71ee4ddb8ac9cb24ae953327eaa14b59fa326"}
Apr 17 17:31:53.024771 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.024749 2565 generic.go:358] "Generic (PLEG): container finished" podID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerID="5eebafb6127adb1e931a5cfc57aca22228abb80e79ff8b19fd598fd5c6bb3c14" exitCode=0
Apr 17 17:31:53.024870 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.024783 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2" event={"ID":"f2cdabb5-8b38-4c49-ac05-7e9c2954af26","Type":"ContainerDied","Data":"5eebafb6127adb1e931a5cfc57aca22228abb80e79ff8b19fd598fd5c6bb3c14"}
Apr 17 17:31:53.024870 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:53.024807 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2" event={"ID":"f2cdabb5-8b38-4c49-ac05-7e9c2954af26","Type":"ContainerStarted","Data":"68e8db77d66e5d0ab07091d772d0becad6e61fc477542ed1e256c424c294faae"}
Apr 17 17:31:54.030248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:54.030216 2565 generic.go:358] "Generic (PLEG): container finished" podID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerID="99b0e44101b4e94abafa695f8def78abd6c1249785065feb69cbcde624653322" exitCode=0
Apr 17 17:31:54.030662 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:54.030302 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2" event={"ID":"f2cdabb5-8b38-4c49-ac05-7e9c2954af26","Type":"ContainerDied","Data":"99b0e44101b4e94abafa695f8def78abd6c1249785065feb69cbcde624653322"}
Apr 17 17:31:54.032054 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:54.031931 2565 generic.go:358] "Generic (PLEG): container finished" podID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerID="73460facde4641d4624b1cb274dc99be13d357d27d2a56616bc7dcb45e415733" exitCode=0
Apr 17 17:31:54.032054 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:54.031978 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt" event={"ID":"779eb2ca-4e40-435c-9f03-6a8a80b19007","Type":"ContainerDied","Data":"73460facde4641d4624b1cb274dc99be13d357d27d2a56616bc7dcb45e415733"}
Apr 17 17:31:55.036952 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.036917 2565 generic.go:358] "Generic (PLEG): container finished" podID="b8a36306-fd09-4739-9a23-f76b249700d5" containerID="bb960e173310216072144cf3d1567f6461539780494d618cc15d2a8429f08e5c" exitCode=0
Apr 17 17:31:55.037349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.037009 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4" event={"ID":"b8a36306-fd09-4739-9a23-f76b249700d5","Type":"ContainerDied","Data":"bb960e173310216072144cf3d1567f6461539780494d618cc15d2a8429f08e5c"}
Apr 17 17:31:55.039127 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.039107 2565 generic.go:358] "Generic (PLEG): container finished" podID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerID="1ea0d1595d74b90864a925fefe19d93282cbdb5e0771cf258c9fd473da7ae20f" exitCode=0
Apr 17 17:31:55.039223 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.039189 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt" event={"ID":"779eb2ca-4e40-435c-9f03-6a8a80b19007","Type":"ContainerDied","Data":"1ea0d1595d74b90864a925fefe19d93282cbdb5e0771cf258c9fd473da7ae20f"}
Apr 17 17:31:55.040914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.040886 2565 generic.go:358] "Generic (PLEG): container finished" podID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerID="cf44d5a560b698834bda0ec726e3cbe09057ec9fe32150fa860547533de54a5a" exitCode=0
Apr 17 17:31:55.041026 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.040911 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2" event={"ID":"f2cdabb5-8b38-4c49-ac05-7e9c2954af26","Type":"ContainerDied","Data":"cf44d5a560b698834bda0ec726e3cbe09057ec9fe32150fa860547533de54a5a"}
Apr 17 17:31:55.042676 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.042655 2565 generic.go:358] "Generic (PLEG): container finished" podID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerID="bb3f2df196442f86f7dc50f0ee16d823f0de01b6ac13c7020c9c3a93969d12cd" exitCode=0
Apr 17 17:31:55.042765 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:55.042688 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt" event={"ID":"75902eeb-c300-4a8d-b84b-fc8142bfb723","Type":"ContainerDied","Data":"bb3f2df196442f86f7dc50f0ee16d823f0de01b6ac13c7020c9c3a93969d12cd"}
Apr 17 17:31:56.048807 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.048772 2565 generic.go:358] "Generic (PLEG): container finished" podID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerID="fa71712581094a8481773fae3ba72625be6c827b8cba31eb552fb6d38a44abc3" exitCode=0
Apr 17 17:31:56.049209 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.048877 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt" event={"ID":"75902eeb-c300-4a8d-b84b-fc8142bfb723","Type":"ContainerDied","Data":"fa71712581094a8481773fae3ba72625be6c827b8cba31eb552fb6d38a44abc3"}
Apr 17 17:31:56.050652 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.050623 2565 generic.go:358] "Generic (PLEG): container finished" podID="b8a36306-fd09-4739-9a23-f76b249700d5" containerID="c28824e9b483b769caa18706dc921caf03a62ba56462352f95867214cdddad95" exitCode=0
Apr 17 17:31:56.050777 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.050698 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4" event={"ID":"b8a36306-fd09-4739-9a23-f76b249700d5","Type":"ContainerDied","Data":"c28824e9b483b769caa18706dc921caf03a62ba56462352f95867214cdddad95"}
Apr 17 17:31:56.182730 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.182704 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt"
Apr 17 17:31:56.217694 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.217663 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2"
Apr 17 17:31:56.262410 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.262382 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-bundle\") pod \"779eb2ca-4e40-435c-9f03-6a8a80b19007\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") "
Apr 17 17:31:56.262574 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.262466 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-util\") pod \"779eb2ca-4e40-435c-9f03-6a8a80b19007\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") "
Apr 17 17:31:56.262574 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.262503 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xddrp\" (UniqueName: \"kubernetes.io/projected/779eb2ca-4e40-435c-9f03-6a8a80b19007-kube-api-access-xddrp\") pod \"779eb2ca-4e40-435c-9f03-6a8a80b19007\" (UID: \"779eb2ca-4e40-435c-9f03-6a8a80b19007\") "
Apr 17 17:31:56.263169 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.263142 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-bundle" (OuterVolumeSpecName: "bundle") pod "779eb2ca-4e40-435c-9f03-6a8a80b19007" (UID: "779eb2ca-4e40-435c-9f03-6a8a80b19007"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:31:56.264725 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.264701 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/779eb2ca-4e40-435c-9f03-6a8a80b19007-kube-api-access-xddrp" (OuterVolumeSpecName: "kube-api-access-xddrp") pod "779eb2ca-4e40-435c-9f03-6a8a80b19007" (UID: "779eb2ca-4e40-435c-9f03-6a8a80b19007"). InnerVolumeSpecName "kube-api-access-xddrp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:31:56.268256 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.268234 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-util" (OuterVolumeSpecName: "util") pod "779eb2ca-4e40-435c-9f03-6a8a80b19007" (UID: "779eb2ca-4e40-435c-9f03-6a8a80b19007"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:31:56.363671 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.363624 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkntn\" (UniqueName: \"kubernetes.io/projected/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-kube-api-access-fkntn\") pod \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") "
Apr 17 17:31:56.363671 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.363681 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-util\") pod \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") "
Apr 17 17:31:56.363981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.363723 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-bundle\") pod
\"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\" (UID: \"f2cdabb5-8b38-4c49-ac05-7e9c2954af26\") " Apr 17 17:31:56.363981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.363917 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:56.363981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.363929 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xddrp\" (UniqueName: \"kubernetes.io/projected/779eb2ca-4e40-435c-9f03-6a8a80b19007-kube-api-access-xddrp\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:56.363981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.363939 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/779eb2ca-4e40-435c-9f03-6a8a80b19007-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:56.364293 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.364266 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-bundle" (OuterVolumeSpecName: "bundle") pod "f2cdabb5-8b38-4c49-ac05-7e9c2954af26" (UID: "f2cdabb5-8b38-4c49-ac05-7e9c2954af26"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:56.365748 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.365725 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-kube-api-access-fkntn" (OuterVolumeSpecName: "kube-api-access-fkntn") pod "f2cdabb5-8b38-4c49-ac05-7e9c2954af26" (UID: "f2cdabb5-8b38-4c49-ac05-7e9c2954af26"). InnerVolumeSpecName "kube-api-access-fkntn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:56.368505 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.368460 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-util" (OuterVolumeSpecName: "util") pod "f2cdabb5-8b38-4c49-ac05-7e9c2954af26" (UID: "f2cdabb5-8b38-4c49-ac05-7e9c2954af26"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:56.464901 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.464866 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkntn\" (UniqueName: \"kubernetes.io/projected/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-kube-api-access-fkntn\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:56.464901 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.464896 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:56.464901 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:56.464909 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f2cdabb5-8b38-4c49-ac05-7e9c2954af26-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:57.058534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.058502 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt" Apr 17 17:31:57.058982 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.058498 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5037s9xt" event={"ID":"779eb2ca-4e40-435c-9f03-6a8a80b19007","Type":"ContainerDied","Data":"14908aa4eaf9833f6fa0b33c82b71ee4ddb8ac9cb24ae953327eaa14b59fa326"} Apr 17 17:31:57.058982 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.058619 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14908aa4eaf9833f6fa0b33c82b71ee4ddb8ac9cb24ae953327eaa14b59fa326" Apr 17 17:31:57.060138 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.060115 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2" event={"ID":"f2cdabb5-8b38-4c49-ac05-7e9c2954af26","Type":"ContainerDied","Data":"68e8db77d66e5d0ab07091d772d0becad6e61fc477542ed1e256c424c294faae"} Apr 17 17:31:57.060138 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.060139 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e8db77d66e5d0ab07091d772d0becad6e61fc477542ed1e256c424c294faae" Apr 17 17:31:57.060862 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.060428 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767b2zrk2" Apr 17 17:31:57.214789 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.214768 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4" Apr 17 17:31:57.217744 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.217726 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt" Apr 17 17:31:57.371494 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.371390 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7v9x\" (UniqueName: \"kubernetes.io/projected/75902eeb-c300-4a8d-b84b-fc8142bfb723-kube-api-access-x7v9x\") pod \"75902eeb-c300-4a8d-b84b-fc8142bfb723\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " Apr 17 17:31:57.371494 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.371436 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-util\") pod \"75902eeb-c300-4a8d-b84b-fc8142bfb723\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " Apr 17 17:31:57.371494 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.371480 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-bundle\") pod \"b8a36306-fd09-4739-9a23-f76b249700d5\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " Apr 17 17:31:57.371494 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.371496 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dfk\" (UniqueName: \"kubernetes.io/projected/b8a36306-fd09-4739-9a23-f76b249700d5-kube-api-access-t9dfk\") pod \"b8a36306-fd09-4739-9a23-f76b249700d5\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " Apr 17 17:31:57.371817 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.371522 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-util\") pod \"b8a36306-fd09-4739-9a23-f76b249700d5\" (UID: \"b8a36306-fd09-4739-9a23-f76b249700d5\") " Apr 17 17:31:57.371817 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:31:57.371565 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-bundle\") pod \"75902eeb-c300-4a8d-b84b-fc8142bfb723\" (UID: \"75902eeb-c300-4a8d-b84b-fc8142bfb723\") " Apr 17 17:31:57.372145 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.372101 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-bundle" (OuterVolumeSpecName: "bundle") pod "b8a36306-fd09-4739-9a23-f76b249700d5" (UID: "b8a36306-fd09-4739-9a23-f76b249700d5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:57.372301 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.372211 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-bundle" (OuterVolumeSpecName: "bundle") pod "75902eeb-c300-4a8d-b84b-fc8142bfb723" (UID: "75902eeb-c300-4a8d-b84b-fc8142bfb723"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:57.373632 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.373609 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a36306-fd09-4739-9a23-f76b249700d5-kube-api-access-t9dfk" (OuterVolumeSpecName: "kube-api-access-t9dfk") pod "b8a36306-fd09-4739-9a23-f76b249700d5" (UID: "b8a36306-fd09-4739-9a23-f76b249700d5"). InnerVolumeSpecName "kube-api-access-t9dfk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:57.374009 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.373994 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75902eeb-c300-4a8d-b84b-fc8142bfb723-kube-api-access-x7v9x" (OuterVolumeSpecName: "kube-api-access-x7v9x") pod "75902eeb-c300-4a8d-b84b-fc8142bfb723" (UID: "75902eeb-c300-4a8d-b84b-fc8142bfb723"). InnerVolumeSpecName "kube-api-access-x7v9x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:57.377618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.377581 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-util" (OuterVolumeSpecName: "util") pod "75902eeb-c300-4a8d-b84b-fc8142bfb723" (UID: "75902eeb-c300-4a8d-b84b-fc8142bfb723"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:57.377708 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.377687 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-util" (OuterVolumeSpecName: "util") pod "b8a36306-fd09-4739-9a23-f76b249700d5" (UID: "b8a36306-fd09-4739-9a23-f76b249700d5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:31:57.472637 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.472590 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x7v9x\" (UniqueName: \"kubernetes.io/projected/75902eeb-c300-4a8d-b84b-fc8142bfb723-kube-api-access-x7v9x\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:57.472637 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.472630 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:57.472637 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.472641 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:57.472637 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.472649 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9dfk\" (UniqueName: \"kubernetes.io/projected/b8a36306-fd09-4739-9a23-f76b249700d5-kube-api-access-t9dfk\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:57.472921 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.472658 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8a36306-fd09-4739-9a23-f76b249700d5-util\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:57.472921 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:57.472667 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75902eeb-c300-4a8d-b84b-fc8142bfb723-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:31:58.065748 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:58.065710 2565 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt" event={"ID":"75902eeb-c300-4a8d-b84b-fc8142bfb723","Type":"ContainerDied","Data":"ab0b26c1f927055003c3b97340472095be6587aba163889d543a58f84fcb1738"} Apr 17 17:31:58.065748 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:58.065746 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab0b26c1f927055003c3b97340472095be6587aba163889d543a58f84fcb1738" Apr 17 17:31:58.066242 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:58.065763 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30px5pt" Apr 17 17:31:58.067424 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:58.067404 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4" event={"ID":"b8a36306-fd09-4739-9a23-f76b249700d5","Type":"ContainerDied","Data":"cbcf2d06f5d3b5a2f01dd43bd53740d0101f22e51331c7fefeb074ecaf5ac98e"} Apr 17 17:31:58.067494 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:58.067427 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbcf2d06f5d3b5a2f01dd43bd53740d0101f22e51331c7fefeb074ecaf5ac98e" Apr 17 17:31:58.067527 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:31:58.067499 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88tdpb4" Apr 17 17:32:09.447617 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:09.447586 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6c86f79d-qxg9m"] Apr 17 17:32:17.714737 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.714703 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"] Apr 17 17:32:17.715146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715109 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerName="util" Apr 17 17:32:17.715146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715122 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerName="util" Apr 17 17:32:17.715146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715131 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerName="extract" Apr 17 17:32:17.715146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715136 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerName="extract" Apr 17 17:32:17.715146 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715145 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerName="extract" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715150 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerName="extract" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715160 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="779eb2ca-4e40-435c-9f03-6a8a80b19007" 
containerName="pull" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715165 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerName="pull" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715175 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerName="util" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715181 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerName="util" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715192 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a36306-fd09-4739-9a23-f76b249700d5" containerName="util" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715197 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a36306-fd09-4739-9a23-f76b249700d5" containerName="util" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715205 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a36306-fd09-4739-9a23-f76b249700d5" containerName="pull" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715210 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a36306-fd09-4739-9a23-f76b249700d5" containerName="pull" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715219 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerName="util" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715225 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerName="util" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:32:17.715235 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8a36306-fd09-4739-9a23-f76b249700d5" containerName="extract" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715240 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a36306-fd09-4739-9a23-f76b249700d5" containerName="extract" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715250 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerName="pull" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715255 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerName="pull" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715261 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerName="extract" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715267 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerName="extract" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715273 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerName="pull" Apr 17 17:32:17.715312 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715277 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerName="pull" Apr 17 17:32:17.715865 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715367 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="779eb2ca-4e40-435c-9f03-6a8a80b19007" containerName="extract" Apr 17 17:32:17.715865 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715378 2565 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="75902eeb-c300-4a8d-b84b-fc8142bfb723" containerName="extract" Apr 17 17:32:17.715865 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715387 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8a36306-fd09-4739-9a23-f76b249700d5" containerName="extract" Apr 17 17:32:17.715865 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.715396 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2cdabb5-8b38-4c49-ac05-7e9c2954af26" containerName="extract" Apr 17 17:32:17.718255 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.718238 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" Apr 17 17:32:17.720617 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.720596 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 17:32:17.723196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.723175 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:32:17.724143 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.724123 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:32:17.724240 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.724176 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 17:32:17.724240 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.724227 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bblll\"" Apr 17 17:32:17.735747 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.735721 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"] Apr 17 17:32:17.750470 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.750444 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cbcc1b92-0ae7-45c5-9394-e10768ea865e-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" Apr 17 17:32:17.750570 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.750476 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhj2j\" (UniqueName: \"kubernetes.io/projected/cbcc1b92-0ae7-45c5-9394-e10768ea865e-kube-api-access-dhj2j\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" Apr 17 17:32:17.750570 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.750519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcc1b92-0ae7-45c5-9394-e10768ea865e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" Apr 17 17:32:17.851211 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.851174 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cbcc1b92-0ae7-45c5-9394-e10768ea865e-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" Apr 17 17:32:17.851211 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.851212 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhj2j\" (UniqueName: 
\"kubernetes.io/projected/cbcc1b92-0ae7-45c5-9394-e10768ea865e-kube-api-access-dhj2j\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" Apr 17 17:32:17.851462 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.851255 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcc1b92-0ae7-45c5-9394-e10768ea865e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" Apr 17 17:32:17.851462 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:32:17.851337 2565 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 17:32:17.851462 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:32:17.851401 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbcc1b92-0ae7-45c5-9394-e10768ea865e-plugin-serving-cert podName:cbcc1b92-0ae7-45c5-9394-e10768ea865e nodeName:}" failed. No retries permitted until 2026-04-17 17:32:18.351384421 +0000 UTC m=+487.684002045 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/cbcc1b92-0ae7-45c5-9394-e10768ea865e-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-r2tw7" (UID: "cbcc1b92-0ae7-45c5-9394-e10768ea865e") : secret "plugin-serving-cert" not found
Apr 17 17:32:17.851900 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.851876 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cbcc1b92-0ae7-45c5-9394-e10768ea865e-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"
Apr 17 17:32:17.883467 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:17.883434 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhj2j\" (UniqueName: \"kubernetes.io/projected/cbcc1b92-0ae7-45c5-9394-e10768ea865e-kube-api-access-dhj2j\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"
Apr 17 17:32:18.355184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:18.355149 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcc1b92-0ae7-45c5-9394-e10768ea865e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"
Apr 17 17:32:18.357518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:18.357499 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbcc1b92-0ae7-45c5-9394-e10768ea865e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-r2tw7\" (UID: \"cbcc1b92-0ae7-45c5-9394-e10768ea865e\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"
Apr 17 17:32:18.627487 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:18.627401 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"
Apr 17 17:32:18.753122 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:18.753098 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7"]
Apr 17 17:32:18.754997 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:32:18.754967 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbcc1b92_0ae7_45c5_9394_e10768ea865e.slice/crio-5c9106eab7dd949b4e583a2224312d912b143de97788eca79682310014eb09e9 WatchSource:0}: Error finding container 5c9106eab7dd949b4e583a2224312d912b143de97788eca79682310014eb09e9: Status 404 returned error can't find the container with id 5c9106eab7dd949b4e583a2224312d912b143de97788eca79682310014eb09e9
Apr 17 17:32:19.147385 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:19.147340 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" event={"ID":"cbcc1b92-0ae7-45c5-9394-e10768ea865e","Type":"ContainerStarted","Data":"5c9106eab7dd949b4e583a2224312d912b143de97788eca79682310014eb09e9"}
Apr 17 17:32:24.169599 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:24.169557 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" event={"ID":"cbcc1b92-0ae7-45c5-9394-e10768ea865e","Type":"ContainerStarted","Data":"884b10f09e83e7bfed2a2126ea0e04dc498c4839407dd90e103b00fea1dd5650"}
Apr 17 17:32:24.187297 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:24.187242 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-r2tw7" podStartSLOduration=2.250359739 podStartE2EDuration="7.187228506s" podCreationTimestamp="2026-04-17 17:32:17 +0000 UTC" firstStartedPulling="2026-04-17 17:32:18.756345862 +0000 UTC m=+488.088963486" lastFinishedPulling="2026-04-17 17:32:23.693214625 +0000 UTC m=+493.025832253" observedRunningTime="2026-04-17 17:32:24.185811779 +0000 UTC m=+493.518429436" watchObservedRunningTime="2026-04-17 17:32:24.187228506 +0000 UTC m=+493.519846130"
Apr 17 17:32:34.468790 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.468712 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f6c86f79d-qxg9m" podUID="b32c4eb4-4269-4758-9645-d2483932b25e" containerName="console" containerID="cri-o://d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6" gracePeriod=15
Apr 17 17:32:34.712173 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.712151 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6c86f79d-qxg9m_b32c4eb4-4269-4758-9645-d2483932b25e/console/0.log"
Apr 17 17:32:34.712291 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.712210 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:32:34.803273 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803244 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-oauth-serving-cert\") pod \"b32c4eb4-4269-4758-9645-d2483932b25e\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") "
Apr 17 17:32:34.803445 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803289 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-serving-cert\") pod \"b32c4eb4-4269-4758-9645-d2483932b25e\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") "
Apr 17 17:32:34.803445 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803313 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-oauth-config\") pod \"b32c4eb4-4269-4758-9645-d2483932b25e\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") "
Apr 17 17:32:34.803445 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803373 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-trusted-ca-bundle\") pod \"b32c4eb4-4269-4758-9645-d2483932b25e\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") "
Apr 17 17:32:34.803445 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803395 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvl7t\" (UniqueName: \"kubernetes.io/projected/b32c4eb4-4269-4758-9645-d2483932b25e-kube-api-access-xvl7t\") pod \"b32c4eb4-4269-4758-9645-d2483932b25e\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") "
Apr 17 17:32:34.803445 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803415 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-service-ca\") pod \"b32c4eb4-4269-4758-9645-d2483932b25e\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") "
Apr 17 17:32:34.803713 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803496 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-console-config\") pod \"b32c4eb4-4269-4758-9645-d2483932b25e\" (UID: \"b32c4eb4-4269-4758-9645-d2483932b25e\") "
Apr 17 17:32:34.803823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803798 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b32c4eb4-4269-4758-9645-d2483932b25e" (UID: "b32c4eb4-4269-4758-9645-d2483932b25e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:32:34.803913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.803811 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b32c4eb4-4269-4758-9645-d2483932b25e" (UID: "b32c4eb4-4269-4758-9645-d2483932b25e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:32:34.804064 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.804036 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-service-ca" (OuterVolumeSpecName: "service-ca") pod "b32c4eb4-4269-4758-9645-d2483932b25e" (UID: "b32c4eb4-4269-4758-9645-d2483932b25e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:32:34.804064 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.804062 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-console-config" (OuterVolumeSpecName: "console-config") pod "b32c4eb4-4269-4758-9645-d2483932b25e" (UID: "b32c4eb4-4269-4758-9645-d2483932b25e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:32:34.805677 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.805643 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32c4eb4-4269-4758-9645-d2483932b25e-kube-api-access-xvl7t" (OuterVolumeSpecName: "kube-api-access-xvl7t") pod "b32c4eb4-4269-4758-9645-d2483932b25e" (UID: "b32c4eb4-4269-4758-9645-d2483932b25e"). InnerVolumeSpecName "kube-api-access-xvl7t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:32:34.806051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.806017 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b32c4eb4-4269-4758-9645-d2483932b25e" (UID: "b32c4eb4-4269-4758-9645-d2483932b25e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:32:34.806051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.806029 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b32c4eb4-4269-4758-9645-d2483932b25e" (UID: "b32c4eb4-4269-4758-9645-d2483932b25e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:32:34.905156 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.905116 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-oauth-serving-cert\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:32:34.905156 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.905153 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-serving-cert\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:32:34.905156 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.905165 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b32c4eb4-4269-4758-9645-d2483932b25e-console-oauth-config\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:32:34.905391 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.905177 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-trusted-ca-bundle\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:32:34.905391 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.905189 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xvl7t\" (UniqueName: \"kubernetes.io/projected/b32c4eb4-4269-4758-9645-d2483932b25e-kube-api-access-xvl7t\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:32:34.905391 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.905202 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-service-ca\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:32:34.905391 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:34.905213 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b32c4eb4-4269-4758-9645-d2483932b25e-console-config\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:32:35.211556 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.211532 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6c86f79d-qxg9m_b32c4eb4-4269-4758-9645-d2483932b25e/console/0.log"
Apr 17 17:32:35.211704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.211573 2565 generic.go:358] "Generic (PLEG): container finished" podID="b32c4eb4-4269-4758-9645-d2483932b25e" containerID="d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6" exitCode=2
Apr 17 17:32:35.211704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.211625 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6c86f79d-qxg9m" event={"ID":"b32c4eb4-4269-4758-9645-d2483932b25e","Type":"ContainerDied","Data":"d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6"}
Apr 17 17:32:35.211704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.211633 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6c86f79d-qxg9m"
Apr 17 17:32:35.211704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.211659 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6c86f79d-qxg9m" event={"ID":"b32c4eb4-4269-4758-9645-d2483932b25e","Type":"ContainerDied","Data":"952827768f7dbc42b6a5190b12f04bd79fb76a7161c5ef663e11194ff4113fc1"}
Apr 17 17:32:35.211704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.211677 2565 scope.go:117] "RemoveContainer" containerID="d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6"
Apr 17 17:32:35.220796 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.220779 2565 scope.go:117] "RemoveContainer" containerID="d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6"
Apr 17 17:32:35.221069 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:32:35.221050 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6\": container with ID starting with d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6 not found: ID does not exist" containerID="d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6"
Apr 17 17:32:35.221143 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.221080 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6"} err="failed to get container status \"d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6\": rpc error: code = NotFound desc = could not find container \"d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6\": container with ID starting with d4d679938663013a8308fb6248ac5049a0a225272dd4bc836a833a5e809553e6 not found: ID does not exist"
Apr 17 17:32:35.230534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.230509 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6c86f79d-qxg9m"]
Apr 17 17:32:35.234360 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:35.234341 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f6c86f79d-qxg9m"]
Apr 17 17:32:37.194401 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:32:37.194365 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b32c4eb4-4269-4758-9645-d2483932b25e" path="/var/lib/kubelet/pods/b32c4eb4-4269-4758-9645-d2483932b25e/volumes"
Apr 17 17:33:02.866673 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:02.866590 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-s6l8t"]
Apr 17 17:33:02.867250 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:02.866989 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b32c4eb4-4269-4758-9645-d2483932b25e" containerName="console"
Apr 17 17:33:02.867250 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:02.867005 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32c4eb4-4269-4758-9645-d2483932b25e" containerName="console"
Apr 17 17:33:02.867250 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:02.867071 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b32c4eb4-4269-4758-9645-d2483932b25e" containerName="console"
Apr 17 17:33:02.868971 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:02.868953 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-s6l8t"
Apr 17 17:33:02.871714 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:02.871694 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-69zkb\""
Apr 17 17:33:02.881511 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:02.881482 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-s6l8t"]
Apr 17 17:33:03.015727 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.015690 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-2gtjm"]
Apr 17 17:33:03.018033 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.018017 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-2gtjm"
Apr 17 17:33:03.026331 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.026306 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-2gtjm"]
Apr 17 17:33:03.054542 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.054510 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kkcq\" (UniqueName: \"kubernetes.io/projected/43faa243-e2fe-4326-ac6d-81c0587109ac-kube-api-access-2kkcq\") pod \"authorino-674b59b84c-s6l8t\" (UID: \"43faa243-e2fe-4326-ac6d-81c0587109ac\") " pod="kuadrant-system/authorino-674b59b84c-s6l8t"
Apr 17 17:33:03.155155 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.155057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kkcq\" (UniqueName: \"kubernetes.io/projected/43faa243-e2fe-4326-ac6d-81c0587109ac-kube-api-access-2kkcq\") pod \"authorino-674b59b84c-s6l8t\" (UID: \"43faa243-e2fe-4326-ac6d-81c0587109ac\") " pod="kuadrant-system/authorino-674b59b84c-s6l8t"
Apr 17 17:33:03.155155 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.155113 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/1c5ba748-fd44-48a8-aa4c-08372515d92d-kube-api-access-dwkwb\") pod \"authorino-79cbc94b89-2gtjm\" (UID: \"1c5ba748-fd44-48a8-aa4c-08372515d92d\") " pod="kuadrant-system/authorino-79cbc94b89-2gtjm"
Apr 17 17:33:03.163648 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.163615 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kkcq\" (UniqueName: \"kubernetes.io/projected/43faa243-e2fe-4326-ac6d-81c0587109ac-kube-api-access-2kkcq\") pod \"authorino-674b59b84c-s6l8t\" (UID: \"43faa243-e2fe-4326-ac6d-81c0587109ac\") " pod="kuadrant-system/authorino-674b59b84c-s6l8t"
Apr 17 17:33:03.178488 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.178464 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-s6l8t"
Apr 17 17:33:03.256558 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.256509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/1c5ba748-fd44-48a8-aa4c-08372515d92d-kube-api-access-dwkwb\") pod \"authorino-79cbc94b89-2gtjm\" (UID: \"1c5ba748-fd44-48a8-aa4c-08372515d92d\") " pod="kuadrant-system/authorino-79cbc94b89-2gtjm"
Apr 17 17:33:03.265244 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.265211 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/1c5ba748-fd44-48a8-aa4c-08372515d92d-kube-api-access-dwkwb\") pod \"authorino-79cbc94b89-2gtjm\" (UID: \"1c5ba748-fd44-48a8-aa4c-08372515d92d\") " pod="kuadrant-system/authorino-79cbc94b89-2gtjm"
Apr 17 17:33:03.304296 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.304271 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-s6l8t"]
Apr 17 17:33:03.306063 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:33:03.306021 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43faa243_e2fe_4326_ac6d_81c0587109ac.slice/crio-704ec2ddbc96c271ddfcbd61d00dec8e35402115e672656d12bdbb26e5ab2f13 WatchSource:0}: Error finding container 704ec2ddbc96c271ddfcbd61d00dec8e35402115e672656d12bdbb26e5ab2f13: Status 404 returned error can't find the container with id 704ec2ddbc96c271ddfcbd61d00dec8e35402115e672656d12bdbb26e5ab2f13
Apr 17 17:33:03.315805 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.315772 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-s6l8t" event={"ID":"43faa243-e2fe-4326-ac6d-81c0587109ac","Type":"ContainerStarted","Data":"704ec2ddbc96c271ddfcbd61d00dec8e35402115e672656d12bdbb26e5ab2f13"}
Apr 17 17:33:03.328025 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.327998 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-2gtjm"
Apr 17 17:33:03.454567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:03.454540 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-2gtjm"]
Apr 17 17:33:03.455714 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:33:03.455687 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c5ba748_fd44_48a8_aa4c_08372515d92d.slice/crio-8a556926a0f35b0cb89e73f6ae65603e91166fcf2deb6349a6564001c81f9155 WatchSource:0}: Error finding container 8a556926a0f35b0cb89e73f6ae65603e91166fcf2deb6349a6564001c81f9155: Status 404 returned error can't find the container with id 8a556926a0f35b0cb89e73f6ae65603e91166fcf2deb6349a6564001c81f9155
Apr 17 17:33:04.322576 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:04.322534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-2gtjm" event={"ID":"1c5ba748-fd44-48a8-aa4c-08372515d92d","Type":"ContainerStarted","Data":"8a556926a0f35b0cb89e73f6ae65603e91166fcf2deb6349a6564001c81f9155"}
Apr 17 17:33:06.332684 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:06.332647 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-s6l8t" event={"ID":"43faa243-e2fe-4326-ac6d-81c0587109ac","Type":"ContainerStarted","Data":"21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0"}
Apr 17 17:33:06.334023 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:06.334002 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-2gtjm" event={"ID":"1c5ba748-fd44-48a8-aa4c-08372515d92d","Type":"ContainerStarted","Data":"0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca"}
Apr 17 17:33:06.348620 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:06.348577 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-s6l8t" podStartSLOduration=1.902008782 podStartE2EDuration="4.348565934s" podCreationTimestamp="2026-04-17 17:33:02 +0000 UTC" firstStartedPulling="2026-04-17 17:33:03.308170068 +0000 UTC m=+532.640787706" lastFinishedPulling="2026-04-17 17:33:05.754727231 +0000 UTC m=+535.087344858" observedRunningTime="2026-04-17 17:33:06.34634061 +0000 UTC m=+535.678958262" watchObservedRunningTime="2026-04-17 17:33:06.348565934 +0000 UTC m=+535.681183579"
Apr 17 17:33:06.364071 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:06.364020 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-2gtjm" podStartSLOduration=1.067875929 podStartE2EDuration="3.364005357s" podCreationTimestamp="2026-04-17 17:33:03 +0000 UTC" firstStartedPulling="2026-04-17 17:33:03.456995295 +0000 UTC m=+532.789612918" lastFinishedPulling="2026-04-17 17:33:05.753124719 +0000 UTC m=+535.085742346" observedRunningTime="2026-04-17 17:33:06.362603278 +0000 UTC m=+535.695220924" watchObservedRunningTime="2026-04-17 17:33:06.364005357 +0000 UTC m=+535.696623003"
Apr 17 17:33:06.395113 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:06.395072 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-s6l8t"]
Apr 17 17:33:08.341403 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:08.341354 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-s6l8t" podUID="43faa243-e2fe-4326-ac6d-81c0587109ac" containerName="authorino" containerID="cri-o://21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0" gracePeriod=30
Apr 17 17:33:08.568794 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:08.568773 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-s6l8t"
Apr 17 17:33:08.708117 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:08.708038 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kkcq\" (UniqueName: \"kubernetes.io/projected/43faa243-e2fe-4326-ac6d-81c0587109ac-kube-api-access-2kkcq\") pod \"43faa243-e2fe-4326-ac6d-81c0587109ac\" (UID: \"43faa243-e2fe-4326-ac6d-81c0587109ac\") "
Apr 17 17:33:08.710137 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:08.710115 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43faa243-e2fe-4326-ac6d-81c0587109ac-kube-api-access-2kkcq" (OuterVolumeSpecName: "kube-api-access-2kkcq") pod "43faa243-e2fe-4326-ac6d-81c0587109ac" (UID: "43faa243-e2fe-4326-ac6d-81c0587109ac"). InnerVolumeSpecName "kube-api-access-2kkcq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:33:08.809455 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:08.809420 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kkcq\" (UniqueName: \"kubernetes.io/projected/43faa243-e2fe-4326-ac6d-81c0587109ac-kube-api-access-2kkcq\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:33:09.345460 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.345428 2565 generic.go:358] "Generic (PLEG): container finished" podID="43faa243-e2fe-4326-ac6d-81c0587109ac" containerID="21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0" exitCode=0
Apr 17 17:33:09.345883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.345474 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-s6l8t"
Apr 17 17:33:09.345883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.345514 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-s6l8t" event={"ID":"43faa243-e2fe-4326-ac6d-81c0587109ac","Type":"ContainerDied","Data":"21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0"}
Apr 17 17:33:09.345883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.345551 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-s6l8t" event={"ID":"43faa243-e2fe-4326-ac6d-81c0587109ac","Type":"ContainerDied","Data":"704ec2ddbc96c271ddfcbd61d00dec8e35402115e672656d12bdbb26e5ab2f13"}
Apr 17 17:33:09.345883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.345566 2565 scope.go:117] "RemoveContainer" containerID="21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0"
Apr 17 17:33:09.354052 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.354036 2565 scope.go:117] "RemoveContainer" containerID="21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0"
Apr 17 17:33:09.354320 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:33:09.354301 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0\": container with ID starting with 21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0 not found: ID does not exist" containerID="21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0"
Apr 17 17:33:09.354383 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.354331 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0"} err="failed to get container status \"21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0\": rpc error: code = NotFound desc = could not find container \"21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0\": container with ID starting with 21d592822d743a0c746d7ed235b58b891f93518c1b79920043d0f70d7b6647c0 not found: ID does not exist"
Apr 17 17:33:09.362762 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.362738 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-s6l8t"]
Apr 17 17:33:09.366470 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:09.366450 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-s6l8t"]
Apr 17 17:33:11.195019 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:11.194981 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43faa243-e2fe-4326-ac6d-81c0587109ac" path="/var/lib/kubelet/pods/43faa243-e2fe-4326-ac6d-81c0587109ac/volumes"
Apr 17 17:33:29.876339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.876302 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-97cwd"]
Apr 17 17:33:29.876705 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.876659 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43faa243-e2fe-4326-ac6d-81c0587109ac" containerName="authorino"
Apr 17 17:33:29.876705 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.876669 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="43faa243-e2fe-4326-ac6d-81c0587109ac" containerName="authorino"
Apr 17 17:33:29.876778 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.876728 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="43faa243-e2fe-4326-ac6d-81c0587109ac" containerName="authorino"
Apr 17 17:33:29.879901 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.879884 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:29.882257 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.882235 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 17 17:33:29.887934 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.887910 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-97cwd"]
Apr 17 17:33:29.994017 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.993986 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c417a2cc-773d-4c5c-92c9-c3f262edf566-tls-cert\") pod \"authorino-68bd676465-97cwd\" (UID: \"c417a2cc-773d-4c5c-92c9-c3f262edf566\") " pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:29.994211 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:29.994043 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54fk\" (UniqueName: \"kubernetes.io/projected/c417a2cc-773d-4c5c-92c9-c3f262edf566-kube-api-access-b54fk\") pod \"authorino-68bd676465-97cwd\" (UID: \"c417a2cc-773d-4c5c-92c9-c3f262edf566\") " pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:30.095184 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:30.095150 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c417a2cc-773d-4c5c-92c9-c3f262edf566-tls-cert\") pod \"authorino-68bd676465-97cwd\" (UID: \"c417a2cc-773d-4c5c-92c9-c3f262edf566\") " pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:30.095519 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:30.095221 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b54fk\" (UniqueName: \"kubernetes.io/projected/c417a2cc-773d-4c5c-92c9-c3f262edf566-kube-api-access-b54fk\") pod \"authorino-68bd676465-97cwd\" (UID: \"c417a2cc-773d-4c5c-92c9-c3f262edf566\") " pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:30.097659 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:30.097638 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c417a2cc-773d-4c5c-92c9-c3f262edf566-tls-cert\") pod \"authorino-68bd676465-97cwd\" (UID: \"c417a2cc-773d-4c5c-92c9-c3f262edf566\") " pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:30.103057 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:30.103035 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b54fk\" (UniqueName: \"kubernetes.io/projected/c417a2cc-773d-4c5c-92c9-c3f262edf566-kube-api-access-b54fk\") pod \"authorino-68bd676465-97cwd\" (UID: \"c417a2cc-773d-4c5c-92c9-c3f262edf566\") " pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:30.190436 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:30.190354 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-97cwd"
Apr 17 17:33:30.538886 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:30.538857 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-97cwd"]
Apr 17 17:33:30.540179 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:33:30.540146 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc417a2cc_773d_4c5c_92c9_c3f262edf566.slice/crio-ae9ae33f8fd5fdfa4375962743ad3d7a31d09d4ce2e43abf96e21a0b4f6b8104 WatchSource:0}: Error finding container ae9ae33f8fd5fdfa4375962743ad3d7a31d09d4ce2e43abf96e21a0b4f6b8104: Status 404 returned error can't find the container with id ae9ae33f8fd5fdfa4375962743ad3d7a31d09d4ce2e43abf96e21a0b4f6b8104
Apr 17 17:33:31.438017 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.437980 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-97cwd" event={"ID":"c417a2cc-773d-4c5c-92c9-c3f262edf566","Type":"ContainerStarted","Data":"e18075d3bf20ee8fc873048b24e3d9587364a38a24c2e1c3b6a16defbf21f0e0"}
Apr 17 17:33:31.438017 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.438018 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-97cwd" event={"ID":"c417a2cc-773d-4c5c-92c9-c3f262edf566","Type":"ContainerStarted","Data":"ae9ae33f8fd5fdfa4375962743ad3d7a31d09d4ce2e43abf96e21a0b4f6b8104"}
Apr 17 17:33:31.454746 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.454691 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-97cwd" podStartSLOduration=2.025016377 podStartE2EDuration="2.454679417s" podCreationTimestamp="2026-04-17 17:33:29 +0000 UTC" firstStartedPulling="2026-04-17 17:33:30.541362824 +0000 UTC m=+559.873980449" lastFinishedPulling="2026-04-17 17:33:30.971025862 +0000 UTC m=+560.303643489" observedRunningTime="2026-04-17 17:33:31.453518657 +0000 UTC m=+560.786136301" watchObservedRunningTime="2026-04-17 17:33:31.454679417 +0000 UTC m=+560.787297063"
Apr 17 17:33:31.484320 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.484287 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-2gtjm"]
Apr 17 17:33:31.484541 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.484511 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-2gtjm" podUID="1c5ba748-fd44-48a8-aa4c-08372515d92d" containerName="authorino" containerID="cri-o://0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca" gracePeriod=30
Apr 17 17:33:31.722935 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.722912 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-2gtjm"
Apr 17 17:33:31.812081 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.812048 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/1c5ba748-fd44-48a8-aa4c-08372515d92d-kube-api-access-dwkwb\") pod \"1c5ba748-fd44-48a8-aa4c-08372515d92d\" (UID: \"1c5ba748-fd44-48a8-aa4c-08372515d92d\") "
Apr 17 17:33:31.814116 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.814081 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c5ba748-fd44-48a8-aa4c-08372515d92d-kube-api-access-dwkwb" (OuterVolumeSpecName: "kube-api-access-dwkwb") pod "1c5ba748-fd44-48a8-aa4c-08372515d92d" (UID: "1c5ba748-fd44-48a8-aa4c-08372515d92d"). InnerVolumeSpecName "kube-api-access-dwkwb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:33:31.913629 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:31.913595 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwkwb\" (UniqueName: \"kubernetes.io/projected/1c5ba748-fd44-48a8-aa4c-08372515d92d-kube-api-access-dwkwb\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:33:32.442719 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.442685 2565 generic.go:358] "Generic (PLEG): container finished" podID="1c5ba748-fd44-48a8-aa4c-08372515d92d" containerID="0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca" exitCode=0
Apr 17 17:33:32.443135 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.442734 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-2gtjm"
Apr 17 17:33:32.443135 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.442766 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-2gtjm" event={"ID":"1c5ba748-fd44-48a8-aa4c-08372515d92d","Type":"ContainerDied","Data":"0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca"}
Apr 17 17:33:32.443135 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.442806 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-2gtjm" event={"ID":"1c5ba748-fd44-48a8-aa4c-08372515d92d","Type":"ContainerDied","Data":"8a556926a0f35b0cb89e73f6ae65603e91166fcf2deb6349a6564001c81f9155"}
Apr 17 17:33:32.443135 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.442824 2565 scope.go:117] "RemoveContainer" containerID="0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca"
Apr 17 17:33:32.454412 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.454371 2565 scope.go:117] "RemoveContainer" containerID="0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca"
Apr 17 17:33:32.454961 ip-10-0-143-59 kubenswrapper[2565]: E0417 
17:33:32.454939 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca\": container with ID starting with 0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca not found: ID does not exist" containerID="0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca" Apr 17 17:33:32.455079 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.454972 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca"} err="failed to get container status \"0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca\": rpc error: code = NotFound desc = could not find container \"0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca\": container with ID starting with 0a85c1b4b6b7a12ec800d0574cb1d6322641f74da0fc807b617c1c1d7993d3ca not found: ID does not exist" Apr 17 17:33:32.464649 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.464625 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-2gtjm"] Apr 17 17:33:32.468601 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:32.468578 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-2gtjm"] Apr 17 17:33:33.194560 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:33.194529 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c5ba748-fd44-48a8-aa4c-08372515d92d" path="/var/lib/kubelet/pods/1c5ba748-fd44-48a8-aa4c-08372515d92d/volumes" Apr 17 17:33:49.602673 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.602641 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-2m8vh"] Apr 17 17:33:49.603081 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.603031 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="1c5ba748-fd44-48a8-aa4c-08372515d92d" containerName="authorino" Apr 17 17:33:49.603081 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.603044 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c5ba748-fd44-48a8-aa4c-08372515d92d" containerName="authorino" Apr 17 17:33:49.603154 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.603113 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c5ba748-fd44-48a8-aa4c-08372515d92d" containerName="authorino" Apr 17 17:33:49.606947 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.606925 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:49.609998 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.609902 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 17:33:49.610172 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.610095 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:33:49.614408 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.610347 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:33:49.614408 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.610378 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-n2x47\"" Apr 17 17:33:49.621503 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.621478 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2m8vh"] Apr 17 17:33:49.788054 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.788019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz9kb\" (UniqueName: \"kubernetes.io/projected/dbed05be-4c57-475a-94bf-b997866f6146-kube-api-access-kz9kb\") pod 
\"seaweedfs-86cc847c5c-2m8vh\" (UID: \"dbed05be-4c57-475a-94bf-b997866f6146\") " pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:49.788223 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.788081 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dbed05be-4c57-475a-94bf-b997866f6146-data\") pod \"seaweedfs-86cc847c5c-2m8vh\" (UID: \"dbed05be-4c57-475a-94bf-b997866f6146\") " pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:49.889603 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.889515 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz9kb\" (UniqueName: \"kubernetes.io/projected/dbed05be-4c57-475a-94bf-b997866f6146-kube-api-access-kz9kb\") pod \"seaweedfs-86cc847c5c-2m8vh\" (UID: \"dbed05be-4c57-475a-94bf-b997866f6146\") " pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:49.889603 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.889566 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dbed05be-4c57-475a-94bf-b997866f6146-data\") pod \"seaweedfs-86cc847c5c-2m8vh\" (UID: \"dbed05be-4c57-475a-94bf-b997866f6146\") " pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:49.890029 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.890011 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/dbed05be-4c57-475a-94bf-b997866f6146-data\") pod \"seaweedfs-86cc847c5c-2m8vh\" (UID: \"dbed05be-4c57-475a-94bf-b997866f6146\") " pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:49.901823 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.901791 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz9kb\" (UniqueName: \"kubernetes.io/projected/dbed05be-4c57-475a-94bf-b997866f6146-kube-api-access-kz9kb\") pod 
\"seaweedfs-86cc847c5c-2m8vh\" (UID: \"dbed05be-4c57-475a-94bf-b997866f6146\") " pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:49.920632 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:49.920603 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:50.045138 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:50.045114 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2m8vh"] Apr 17 17:33:50.046981 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:33:50.046946 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbed05be_4c57_475a_94bf_b997866f6146.slice/crio-438e0ba17903ff1c8dd0f85cddc784cc74f69977b05a2dddd6c6c81dc9813c01 WatchSource:0}: Error finding container 438e0ba17903ff1c8dd0f85cddc784cc74f69977b05a2dddd6c6c81dc9813c01: Status 404 returned error can't find the container with id 438e0ba17903ff1c8dd0f85cddc784cc74f69977b05a2dddd6c6c81dc9813c01 Apr 17 17:33:50.515325 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:50.515285 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2m8vh" event={"ID":"dbed05be-4c57-475a-94bf-b997866f6146","Type":"ContainerStarted","Data":"438e0ba17903ff1c8dd0f85cddc784cc74f69977b05a2dddd6c6c81dc9813c01"} Apr 17 17:33:52.524964 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:52.524931 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2m8vh" event={"ID":"dbed05be-4c57-475a-94bf-b997866f6146","Type":"ContainerStarted","Data":"c4e5b26a09a8faf4a2ed6daf81a2099c8b5d860ec24fa82ddb8b8e96976bdc5e"} Apr 17 17:33:52.525320 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:52.524992 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:33:52.543266 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:52.543216 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-2m8vh" podStartSLOduration=1.159807184 podStartE2EDuration="3.543203196s" podCreationTimestamp="2026-04-17 17:33:49 +0000 UTC" firstStartedPulling="2026-04-17 17:33:50.048255412 +0000 UTC m=+579.380873039" lastFinishedPulling="2026-04-17 17:33:52.431651425 +0000 UTC m=+581.764269051" observedRunningTime="2026-04-17 17:33:52.541577555 +0000 UTC m=+581.874195202" watchObservedRunningTime="2026-04-17 17:33:52.543203196 +0000 UTC m=+581.875820841" Apr 17 17:33:58.530032 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:33:58.529999 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-2m8vh" Apr 17 17:34:11.147703 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:34:11.147664 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log" Apr 17 17:34:11.148537 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:34:11.148516 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log" Apr 17 17:35:15.807797 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:15.807763 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-24m8k"] Apr 17 17:35:15.811126 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:15.811108 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-24m8k" Apr 17 17:35:15.818429 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:15.818182 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-24m8k"] Apr 17 17:35:15.967702 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:15.967666 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vh66\" (UniqueName: \"kubernetes.io/projected/49d94812-165f-4f16-8f9e-0fa53631f6c4-kube-api-access-7vh66\") pod \"s3-init-24m8k\" (UID: \"49d94812-165f-4f16-8f9e-0fa53631f6c4\") " pod="kserve/s3-init-24m8k" Apr 17 17:35:16.068781 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:16.068692 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vh66\" (UniqueName: \"kubernetes.io/projected/49d94812-165f-4f16-8f9e-0fa53631f6c4-kube-api-access-7vh66\") pod \"s3-init-24m8k\" (UID: \"49d94812-165f-4f16-8f9e-0fa53631f6c4\") " pod="kserve/s3-init-24m8k" Apr 17 17:35:16.077134 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:16.077110 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vh66\" (UniqueName: \"kubernetes.io/projected/49d94812-165f-4f16-8f9e-0fa53631f6c4-kube-api-access-7vh66\") pod \"s3-init-24m8k\" (UID: \"49d94812-165f-4f16-8f9e-0fa53631f6c4\") " pod="kserve/s3-init-24m8k" Apr 17 17:35:16.120670 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:16.120627 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-24m8k" Apr 17 17:35:16.452596 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:16.452565 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-24m8k"] Apr 17 17:35:16.454691 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:35:16.454665 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d94812_165f_4f16_8f9e_0fa53631f6c4.slice/crio-a919fc3dfea42891eab1ec11c482452b42c6f452bb2b744fcc29addc9d160b74 WatchSource:0}: Error finding container a919fc3dfea42891eab1ec11c482452b42c6f452bb2b744fcc29addc9d160b74: Status 404 returned error can't find the container with id a919fc3dfea42891eab1ec11c482452b42c6f452bb2b744fcc29addc9d160b74 Apr 17 17:35:16.456544 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:16.456523 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:35:16.851527 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:16.851489 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-24m8k" event={"ID":"49d94812-165f-4f16-8f9e-0fa53631f6c4","Type":"ContainerStarted","Data":"a919fc3dfea42891eab1ec11c482452b42c6f452bb2b744fcc29addc9d160b74"} Apr 17 17:35:21.875297 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:21.875262 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-24m8k" event={"ID":"49d94812-165f-4f16-8f9e-0fa53631f6c4","Type":"ContainerStarted","Data":"4f8046b5b5deca1509e86dfa587fb4d4da296cbad0bb82e513d7fdbeb688db14"} Apr 17 17:35:21.891887 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:21.891815 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-24m8k" podStartSLOduration=2.425161944 podStartE2EDuration="6.891798126s" podCreationTimestamp="2026-04-17 17:35:15 +0000 UTC" firstStartedPulling="2026-04-17 17:35:16.456648916 +0000 UTC m=+665.789266541" lastFinishedPulling="2026-04-17 
17:35:20.923285094 +0000 UTC m=+670.255902723" observedRunningTime="2026-04-17 17:35:21.890021056 +0000 UTC m=+671.222638702" watchObservedRunningTime="2026-04-17 17:35:21.891798126 +0000 UTC m=+671.224415813" Apr 17 17:35:24.890029 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:24.889992 2565 generic.go:358] "Generic (PLEG): container finished" podID="49d94812-165f-4f16-8f9e-0fa53631f6c4" containerID="4f8046b5b5deca1509e86dfa587fb4d4da296cbad0bb82e513d7fdbeb688db14" exitCode=0 Apr 17 17:35:24.890427 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:24.890065 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-24m8k" event={"ID":"49d94812-165f-4f16-8f9e-0fa53631f6c4","Type":"ContainerDied","Data":"4f8046b5b5deca1509e86dfa587fb4d4da296cbad0bb82e513d7fdbeb688db14"} Apr 17 17:35:26.022162 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:26.022141 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-24m8k" Apr 17 17:35:26.159461 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:26.159371 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vh66\" (UniqueName: \"kubernetes.io/projected/49d94812-165f-4f16-8f9e-0fa53631f6c4-kube-api-access-7vh66\") pod \"49d94812-165f-4f16-8f9e-0fa53631f6c4\" (UID: \"49d94812-165f-4f16-8f9e-0fa53631f6c4\") " Apr 17 17:35:26.161555 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:26.161521 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d94812-165f-4f16-8f9e-0fa53631f6c4-kube-api-access-7vh66" (OuterVolumeSpecName: "kube-api-access-7vh66") pod "49d94812-165f-4f16-8f9e-0fa53631f6c4" (UID: "49d94812-165f-4f16-8f9e-0fa53631f6c4"). InnerVolumeSpecName "kube-api-access-7vh66". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:35:26.260419 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:26.260378 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vh66\" (UniqueName: \"kubernetes.io/projected/49d94812-165f-4f16-8f9e-0fa53631f6c4-kube-api-access-7vh66\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:35:26.899025 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:26.898993 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-24m8k" Apr 17 17:35:26.899025 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:26.899016 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-24m8k" event={"ID":"49d94812-165f-4f16-8f9e-0fa53631f6c4","Type":"ContainerDied","Data":"a919fc3dfea42891eab1ec11c482452b42c6f452bb2b744fcc29addc9d160b74"} Apr 17 17:35:26.899230 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:26.899051 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a919fc3dfea42891eab1ec11c482452b42c6f452bb2b744fcc29addc9d160b74" Apr 17 17:35:58.709346 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.709273 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd"] Apr 17 17:35:58.709913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.709694 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49d94812-165f-4f16-8f9e-0fa53631f6c4" containerName="s3-init" Apr 17 17:35:58.709913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.709708 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d94812-165f-4f16-8f9e-0fa53631f6c4" containerName="s3-init" Apr 17 17:35:58.709913 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.709778 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="49d94812-165f-4f16-8f9e-0fa53631f6c4" containerName="s3-init" Apr 17 
17:35:58.711740 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.711725 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.715294 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.715268 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 17 17:35:58.715294 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.715286 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:35:58.715468 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.715268 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:35:58.715468 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.715268 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:35:58.720429 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.720397 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd"] Apr 17 17:35:58.837534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.837495 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1722545b-fa5a-45dc-9cd0-93d574bfa292-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.837722 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.837556 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.837722 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.837585 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq5z5\" (UniqueName: \"kubernetes.io/projected/1722545b-fa5a-45dc-9cd0-93d574bfa292-kube-api-access-vq5z5\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.837722 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.837624 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.837722 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.837704 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.837918 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.837753 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.938882 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.938819 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.938882 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.938885 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq5z5\" (UniqueName: \"kubernetes.io/projected/1722545b-fa5a-45dc-9cd0-93d574bfa292-kube-api-access-vq5z5\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.939096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.938918 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.939096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.938953 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.939096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.938992 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.939096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.939042 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1722545b-fa5a-45dc-9cd0-93d574bfa292-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.939345 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.939325 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.939403 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.939367 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.939403 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.939391 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.941096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.941073 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.941467 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.941451 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1722545b-fa5a-45dc-9cd0-93d574bfa292-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:58.948387 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:58.948359 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq5z5\" (UniqueName: \"kubernetes.io/projected/1722545b-fa5a-45dc-9cd0-93d574bfa292-kube-api-access-vq5z5\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:59.022493 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:59.022405 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:35:59.168883 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:35:59.168812 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd"] Apr 17 17:35:59.173053 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:35:59.173026 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1722545b_fa5a_45dc_9cd0_93d574bfa292.slice/crio-fa4bacab96b3681a2040732e048f8de77eaad4cc8bed061f63027d26ca09c00c WatchSource:0}: Error finding container fa4bacab96b3681a2040732e048f8de77eaad4cc8bed061f63027d26ca09c00c: Status 404 returned error can't find the container with id fa4bacab96b3681a2040732e048f8de77eaad4cc8bed061f63027d26ca09c00c Apr 17 17:36:00.035384 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:00.035341 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" event={"ID":"1722545b-fa5a-45dc-9cd0-93d574bfa292","Type":"ContainerStarted","Data":"fa4bacab96b3681a2040732e048f8de77eaad4cc8bed061f63027d26ca09c00c"} Apr 17 17:36:03.050661 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:03.050624 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" event={"ID":"1722545b-fa5a-45dc-9cd0-93d574bfa292","Type":"ContainerStarted","Data":"3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c"} Apr 17 
17:36:07.069832 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:07.069798 2565 generic.go:358] "Generic (PLEG): container finished" podID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerID="3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c" exitCode=0 Apr 17 17:36:07.070239 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:07.069875 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" event={"ID":"1722545b-fa5a-45dc-9cd0-93d574bfa292","Type":"ContainerDied","Data":"3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c"} Apr 17 17:36:09.080147 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:09.080112 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" event={"ID":"1722545b-fa5a-45dc-9cd0-93d574bfa292","Type":"ContainerStarted","Data":"f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974"} Apr 17 17:36:09.100491 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:09.100444 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" podStartSLOduration=2.148149578 podStartE2EDuration="11.100431316s" podCreationTimestamp="2026-04-17 17:35:58 +0000 UTC" firstStartedPulling="2026-04-17 17:35:59.1752336 +0000 UTC m=+708.507851225" lastFinishedPulling="2026-04-17 17:36:08.127515323 +0000 UTC m=+717.460132963" observedRunningTime="2026-04-17 17:36:09.09887069 +0000 UTC m=+718.431488338" watchObservedRunningTime="2026-04-17 17:36:09.100431316 +0000 UTC m=+718.433048962" Apr 17 17:36:19.022530 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:19.022478 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:36:19.022530 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:36:19.022535 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:36:19.035425 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:19.035399 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:36:19.128768 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:19.128740 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:36:56.675386 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:56.675353 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd"] Apr 17 17:36:56.675862 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:56.675623 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" podUID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerName="main" containerID="cri-o://f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974" gracePeriod=30 Apr 17 17:36:56.924540 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:56.924516 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:36:57.065194 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.065165 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1722545b-fa5a-45dc-9cd0-93d574bfa292-tls-certs\") pod \"1722545b-fa5a-45dc-9cd0-93d574bfa292\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " Apr 17 17:36:57.065385 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.065214 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-model-cache\") pod \"1722545b-fa5a-45dc-9cd0-93d574bfa292\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " Apr 17 17:36:57.065385 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.065251 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-kserve-provision-location\") pod \"1722545b-fa5a-45dc-9cd0-93d574bfa292\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " Apr 17 17:36:57.065385 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.065278 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-dshm\") pod \"1722545b-fa5a-45dc-9cd0-93d574bfa292\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " Apr 17 17:36:57.065385 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.065304 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-home\") pod \"1722545b-fa5a-45dc-9cd0-93d574bfa292\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " Apr 17 17:36:57.065385 ip-10-0-143-59 kubenswrapper[2565]: 
I0417 17:36:57.065325 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq5z5\" (UniqueName: \"kubernetes.io/projected/1722545b-fa5a-45dc-9cd0-93d574bfa292-kube-api-access-vq5z5\") pod \"1722545b-fa5a-45dc-9cd0-93d574bfa292\" (UID: \"1722545b-fa5a-45dc-9cd0-93d574bfa292\") " Apr 17 17:36:57.065650 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.065513 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-model-cache" (OuterVolumeSpecName: "model-cache") pod "1722545b-fa5a-45dc-9cd0-93d574bfa292" (UID: "1722545b-fa5a-45dc-9cd0-93d574bfa292"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.065650 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.065599 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-home" (OuterVolumeSpecName: "home") pod "1722545b-fa5a-45dc-9cd0-93d574bfa292" (UID: "1722545b-fa5a-45dc-9cd0-93d574bfa292"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.067524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.067492 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-dshm" (OuterVolumeSpecName: "dshm") pod "1722545b-fa5a-45dc-9cd0-93d574bfa292" (UID: "1722545b-fa5a-45dc-9cd0-93d574bfa292"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.067666 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.067579 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1722545b-fa5a-45dc-9cd0-93d574bfa292-kube-api-access-vq5z5" (OuterVolumeSpecName: "kube-api-access-vq5z5") pod "1722545b-fa5a-45dc-9cd0-93d574bfa292" (UID: "1722545b-fa5a-45dc-9cd0-93d574bfa292"). InnerVolumeSpecName "kube-api-access-vq5z5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:57.067666 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.067588 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1722545b-fa5a-45dc-9cd0-93d574bfa292-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1722545b-fa5a-45dc-9cd0-93d574bfa292" (UID: "1722545b-fa5a-45dc-9cd0-93d574bfa292"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:36:57.118682 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.118636 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1722545b-fa5a-45dc-9cd0-93d574bfa292" (UID: "1722545b-fa5a-45dc-9cd0-93d574bfa292"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:57.166431 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.166397 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-kserve-provision-location\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.166431 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.166425 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-dshm\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.166431 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.166434 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-home\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.166656 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.166442 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vq5z5\" (UniqueName: \"kubernetes.io/projected/1722545b-fa5a-45dc-9cd0-93d574bfa292-kube-api-access-vq5z5\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.166656 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.166452 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1722545b-fa5a-45dc-9cd0-93d574bfa292-tls-certs\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.166656 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.166461 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1722545b-fa5a-45dc-9cd0-93d574bfa292-model-cache\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:36:57.266025 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.265991 2565 generic.go:358] 
"Generic (PLEG): container finished" podID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerID="f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974" exitCode=0 Apr 17 17:36:57.266209 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.266062 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" event={"ID":"1722545b-fa5a-45dc-9cd0-93d574bfa292","Type":"ContainerDied","Data":"f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974"} Apr 17 17:36:57.266209 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.266079 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" Apr 17 17:36:57.266209 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.266104 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd" event={"ID":"1722545b-fa5a-45dc-9cd0-93d574bfa292","Type":"ContainerDied","Data":"fa4bacab96b3681a2040732e048f8de77eaad4cc8bed061f63027d26ca09c00c"} Apr 17 17:36:57.266209 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.266121 2565 scope.go:117] "RemoveContainer" containerID="f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974" Apr 17 17:36:57.277677 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.277653 2565 scope.go:117] "RemoveContainer" containerID="3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c" Apr 17 17:36:57.289480 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.289446 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd"] Apr 17 17:36:57.293339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.293313 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-68fd7d6897hj6jd"] Apr 17 17:36:57.340690 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.340668 2565 scope.go:117] "RemoveContainer" containerID="f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974" Apr 17 17:36:57.341023 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:36:57.341003 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974\": container with ID starting with f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974 not found: ID does not exist" containerID="f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974" Apr 17 17:36:57.341074 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.341033 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974"} err="failed to get container status \"f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974\": rpc error: code = NotFound desc = could not find container \"f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974\": container with ID starting with f2c2dc4d6c2182e4ffb623e553894a108cfff2cc5920ee478372c214ecb9a974 not found: ID does not exist" Apr 17 17:36:57.341074 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.341055 2565 scope.go:117] "RemoveContainer" containerID="3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c" Apr 17 17:36:57.341310 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:36:57.341293 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c\": container with ID starting with 3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c not found: ID does not exist" 
containerID="3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c" Apr 17 17:36:57.341359 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:57.341318 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c"} err="failed to get container status \"3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c\": rpc error: code = NotFound desc = could not find container \"3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c\": container with ID starting with 3ad25b3807da6a3074f9b9c07a4b85c7910f5e7e19afe16c2ef4a9559a079b5c not found: ID does not exist" Apr 17 17:36:59.194220 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:36:59.194186 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1722545b-fa5a-45dc-9cd0-93d574bfa292" path="/var/lib/kubelet/pods/1722545b-fa5a-45dc-9cd0-93d574bfa292/volumes" Apr 17 17:37:25.952406 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.952327 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88"] Apr 17 17:37:25.952775 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.952736 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerName="main" Apr 17 17:37:25.952775 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.952752 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerName="main" Apr 17 17:37:25.952775 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.952767 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerName="storage-initializer" Apr 17 17:37:25.952775 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.952773 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerName="storage-initializer" Apr 17 17:37:25.952930 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.952858 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1722545b-fa5a-45dc-9cd0-93d574bfa292" containerName="main" Apr 17 17:37:25.957914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.957897 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:25.961703 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.961680 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:37:25.962106 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.962087 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:37:25.962232 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.962104 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 17 17:37:25.962232 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.962134 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-j68mk\"" Apr 17 17:37:25.962232 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.962109 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:37:25.972004 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:25.971973 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88"] Apr 17 17:37:26.018434 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.018395 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.018434 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.018435 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnfbr\" (UniqueName: \"kubernetes.io/projected/c80b5021-efca-48c7-ab2f-8867877559fa-kube-api-access-cnfbr\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.018680 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.018532 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.018680 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.018592 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.018680 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:37:26.018667 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.019083 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.018708 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.119612 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.119571 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.119819 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.119638 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.119819 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.119665 
2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnfbr\" (UniqueName: \"kubernetes.io/projected/c80b5021-efca-48c7-ab2f-8867877559fa-kube-api-access-cnfbr\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.119819 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.119732 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.119819 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.119799 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.120101 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.119867 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.120171 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.120145 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.120254 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.120233 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.120316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.120256 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.120316 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.120299 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.122052 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.122033 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.128227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.128208 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnfbr\" (UniqueName: \"kubernetes.io/projected/c80b5021-efca-48c7-ab2f-8867877559fa-kube-api-access-cnfbr\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.267515 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.267428 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:26.405333 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:26.398580 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88"] Apr 17 17:37:26.407371 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:37:26.407337 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc80b5021_efca_48c7_ab2f_8867877559fa.slice/crio-73e34fe34f207137b2eda0bcd63009a331d3f7b38a9b32b1f882f6bfdc400c20 WatchSource:0}: Error finding container 73e34fe34f207137b2eda0bcd63009a331d3f7b38a9b32b1f882f6bfdc400c20: Status 404 returned error can't find the container with id 73e34fe34f207137b2eda0bcd63009a331d3f7b38a9b32b1f882f6bfdc400c20 Apr 17 17:37:27.390880 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:27.390816 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="c80b5021-efca-48c7-ab2f-8867877559fa" containerID="b513767483e8ac09691d9c979a757198b273c93ee2d7240fb107987d91210a81" exitCode=0 Apr 17 17:37:27.391249 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:27.390903 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" event={"ID":"c80b5021-efca-48c7-ab2f-8867877559fa","Type":"ContainerDied","Data":"b513767483e8ac09691d9c979a757198b273c93ee2d7240fb107987d91210a81"} Apr 17 17:37:27.391249 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:27.390939 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" event={"ID":"c80b5021-efca-48c7-ab2f-8867877559fa","Type":"ContainerStarted","Data":"73e34fe34f207137b2eda0bcd63009a331d3f7b38a9b32b1f882f6bfdc400c20"} Apr 17 17:37:28.397577 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:28.397537 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" event={"ID":"c80b5021-efca-48c7-ab2f-8867877559fa","Type":"ContainerStarted","Data":"ec6303bcb915283099bdd2e4b4a84338fe829dc905df44a8422d70735887606d"} Apr 17 17:37:58.533196 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:58.533158 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" event={"ID":"c80b5021-efca-48c7-ab2f-8867877559fa","Type":"ContainerStarted","Data":"ff868ff02e792870b44c8323ccc2230dc8581cb124c06dc42b5a701fd6b76841"} Apr 17 17:37:58.533666 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:58.533353 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:37:58.536199 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:58.536152 2565 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 17 17:37:58.555791 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:58.555741 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podStartSLOduration=3.026286169 podStartE2EDuration="33.555729724s" podCreationTimestamp="2026-04-17 17:37:25 +0000 UTC" firstStartedPulling="2026-04-17 17:37:27.391993198 +0000 UTC m=+796.724610821" lastFinishedPulling="2026-04-17 17:37:57.92143674 +0000 UTC m=+827.254054376" observedRunningTime="2026-04-17 17:37:58.552301284 +0000 UTC m=+827.884918927" watchObservedRunningTime="2026-04-17 17:37:58.555729724 +0000 UTC m=+827.888347368" Apr 17 17:37:58.693232 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:58.693199 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88"] Apr 17 17:37:59.537052 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:59.537012 2565 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" secret="" err="secret \"scheduler-ha-replicas-test-epp-sa-dockercfg-j68mk\" not found" Apr 17 17:37:59.538360 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:37:59.538322 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 17 17:37:59.637423 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:37:59.637385 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:37:59.637619 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:37:59.637485 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs podName:c80b5021-efca-48c7-ab2f-8867877559fa nodeName:}" failed. No retries permitted until 2026-04-17 17:38:00.137462307 +0000 UTC m=+829.470079949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:00.142831 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:00.142793 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:00.143016 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:00.142887 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs podName:c80b5021-efca-48c7-ab2f-8867877559fa nodeName:}" failed. No retries permitted until 2026-04-17 17:38:01.142867597 +0000 UTC m=+830.475485224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:00.541904 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:00.541775 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" containerID="cri-o://ec6303bcb915283099bdd2e4b4a84338fe829dc905df44a8422d70735887606d" gracePeriod=30 Apr 17 17:38:00.541904 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:00.541783 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" 
podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="tokenizer" containerID="cri-o://ff868ff02e792870b44c8323ccc2230dc8581cb124c06dc42b5a701fd6b76841" gracePeriod=30 Apr 17 17:38:00.543516 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:00.543483 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 17 17:38:01.152947 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:01.152917 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:01.153103 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:01.152983 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs podName:c80b5021-efca-48c7-ab2f-8867877559fa nodeName:}" failed. No retries permitted until 2026-04-17 17:38:03.152967544 +0000 UTC m=+832.485585167 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:01.547415 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:01.547331 2565 generic.go:358] "Generic (PLEG): container finished" podID="c80b5021-efca-48c7-ab2f-8867877559fa" containerID="ec6303bcb915283099bdd2e4b4a84338fe829dc905df44a8422d70735887606d" exitCode=0 Apr 17 17:38:01.547415 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:01.547399 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" event={"ID":"c80b5021-efca-48c7-ab2f-8867877559fa","Type":"ContainerDied","Data":"ec6303bcb915283099bdd2e4b4a84338fe829dc905df44a8422d70735887606d"} Apr 17 17:38:03.170945 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:03.170910 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:03.171337 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:03.170982 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs podName:c80b5021-efca-48c7-ab2f-8867877559fa nodeName:}" failed. No retries permitted until 2026-04-17 17:38:07.170968879 +0000 UTC m=+836.503586502 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:06.267534 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:06.267500 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:38:07.211693 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:07.211663 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:07.211895 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:07.211730 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs podName:c80b5021-efca-48c7-ab2f-8867877559fa nodeName:}" failed. No retries permitted until 2026-04-17 17:38:15.211715887 +0000 UTC m=+844.544333511 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:10.542398 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:38:10.542369 2565 logging.go:55] [core] [Channel #26 SubChannel #27]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.51:9003", ServerName: "10.132.0.51:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.51:9003: connect: connection refused" Apr 17 17:38:11.543022 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:11.542977 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.51:9003\" within 1s: context deadline exceeded" Apr 17 17:38:15.290259 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:15.290221 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:15.290644 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:15.290296 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs podName:c80b5021-efca-48c7-ab2f-8867877559fa nodeName:}" failed. No retries permitted until 2026-04-17 17:38:31.290282049 +0000 UTC m=+860.622899672 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found Apr 17 17:38:20.542538 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:38:20.542507 2565 logging.go:55] [core] [Channel #28 SubChannel #29]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.51:9003", ServerName: "10.132.0.51:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.51:9003: connect: connection refused" Apr 17 17:38:21.542944 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:21.542899 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.51:9003\" within 1s: context deadline exceeded" Apr 17 17:38:30.543004 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:38:30.542971 2565 logging.go:55] [core] [Channel #30 SubChannel #31]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.51:9003", ServerName: "10.132.0.51:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.51:9003: connect: connection refused" Apr 17 17:38:30.617344 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:38:30.617312 2565 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc80b5021_efca_48c7_ab2f_8867877559fa.slice/crio-conmon-ff868ff02e792870b44c8323ccc2230dc8581cb124c06dc42b5a701fd6b76841.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc80b5021_efca_48c7_ab2f_8867877559fa.slice/crio-ff868ff02e792870b44c8323ccc2230dc8581cb124c06dc42b5a701fd6b76841.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:38:30.671674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.671648 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88_c80b5021-efca-48c7-ab2f-8867877559fa/tokenizer/0.log" Apr 17 17:38:30.672439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.672407 2565 generic.go:358] "Generic (PLEG): container 
finished" podID="c80b5021-efca-48c7-ab2f-8867877559fa" containerID="ff868ff02e792870b44c8323ccc2230dc8581cb124c06dc42b5a701fd6b76841" exitCode=137 Apr 17 17:38:30.672537 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.672500 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" event={"ID":"c80b5021-efca-48c7-ab2f-8867877559fa","Type":"ContainerDied","Data":"ff868ff02e792870b44c8323ccc2230dc8581cb124c06dc42b5a701fd6b76841"} Apr 17 17:38:30.753023 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.752996 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88_c80b5021-efca-48c7-ab2f-8867877559fa/tokenizer/0.log" Apr 17 17:38:30.753638 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.753620 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:38:30.836734 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.836701 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-uds\") pod \"c80b5021-efca-48c7-ab2f-8867877559fa\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " Apr 17 17:38:30.836734 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.836739 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs\") pod \"c80b5021-efca-48c7-ab2f-8867877559fa\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " Apr 17 17:38:30.837002 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.836758 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-cache\") pod \"c80b5021-efca-48c7-ab2f-8867877559fa\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " Apr 17 17:38:30.837002 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.836798 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-tmp\") pod \"c80b5021-efca-48c7-ab2f-8867877559fa\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " Apr 17 17:38:30.837002 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.836858 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnfbr\" (UniqueName: \"kubernetes.io/projected/c80b5021-efca-48c7-ab2f-8867877559fa-kube-api-access-cnfbr\") pod \"c80b5021-efca-48c7-ab2f-8867877559fa\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " Apr 17 17:38:30.837002 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.836905 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-kserve-provision-location\") pod \"c80b5021-efca-48c7-ab2f-8867877559fa\" (UID: \"c80b5021-efca-48c7-ab2f-8867877559fa\") " Apr 17 17:38:30.837227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.837056 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c80b5021-efca-48c7-ab2f-8867877559fa" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:30.837227 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.837167 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c80b5021-efca-48c7-ab2f-8867877559fa" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:30.837304 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.837227 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-cache\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:38:30.837807 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.837777 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c80b5021-efca-48c7-ab2f-8867877559fa" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:30.837903 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.837884 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c80b5021-efca-48c7-ab2f-8867877559fa" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:30.838998 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.838958 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c80b5021-efca-48c7-ab2f-8867877559fa-kube-api-access-cnfbr" (OuterVolumeSpecName: "kube-api-access-cnfbr") pod "c80b5021-efca-48c7-ab2f-8867877559fa" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa"). InnerVolumeSpecName "kube-api-access-cnfbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:30.839069 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.839051 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c80b5021-efca-48c7-ab2f-8867877559fa" (UID: "c80b5021-efca-48c7-ab2f-8867877559fa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:38:30.938388 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.938354 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-uds\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:38:30.938388 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.938383 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c80b5021-efca-48c7-ab2f-8867877559fa-tls-certs\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:38:30.938388 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.938392 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-tokenizer-tmp\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:38:30.938616 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.938401 2565 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cnfbr\" (UniqueName: \"kubernetes.io/projected/c80b5021-efca-48c7-ab2f-8867877559fa-kube-api-access-cnfbr\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:38:30.938616 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:30.938410 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c80b5021-efca-48c7-ab2f-8867877559fa-kserve-provision-location\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:38:31.543705 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.543598 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.51:9003\" within 1s: context deadline exceeded" Apr 17 17:38:31.677403 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.677372 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88_c80b5021-efca-48c7-ab2f-8867877559fa/tokenizer/0.log" Apr 17 17:38:31.678081 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.678062 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" Apr 17 17:38:31.678198 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.678059 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88" event={"ID":"c80b5021-efca-48c7-ab2f-8867877559fa","Type":"ContainerDied","Data":"73e34fe34f207137b2eda0bcd63009a331d3f7b38a9b32b1f882f6bfdc400c20"} Apr 17 17:38:31.678198 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.678185 2565 scope.go:117] "RemoveContainer" containerID="ff868ff02e792870b44c8323ccc2230dc8581cb124c06dc42b5a701fd6b76841" Apr 17 17:38:31.686982 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.686962 2565 scope.go:117] "RemoveContainer" containerID="ec6303bcb915283099bdd2e4b4a84338fe829dc905df44a8422d70735887606d" Apr 17 17:38:31.695009 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.694991 2565 scope.go:117] "RemoveContainer" containerID="b513767483e8ac09691d9c979a757198b273c93ee2d7240fb107987d91210a81" Apr 17 17:38:31.700718 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.700695 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88"] Apr 17 17:38:31.705609 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:31.705589 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bbjv88"] Apr 17 17:38:33.194778 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:38:33.194746 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" path="/var/lib/kubelet/pods/c80b5021-efca-48c7-ab2f-8867877559fa/volumes" Apr 17 17:39:01.054604 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.054526 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9"] Apr 17 17:39:01.055220 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055196 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="tokenizer" Apr 17 17:39:01.055335 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055222 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="tokenizer" Apr 17 17:39:01.055335 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055260 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" Apr 17 17:39:01.055335 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055269 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" Apr 17 17:39:01.055335 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055280 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="storage-initializer" Apr 17 17:39:01.055335 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055290 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="storage-initializer" Apr 17 17:39:01.055520 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055402 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="main" Apr 17 17:39:01.055520 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.055421 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c80b5021-efca-48c7-ab2f-8867877559fa" containerName="tokenizer" Apr 17 17:39:01.059268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.059250 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.061953 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.061931 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:39:01.062060 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.061983 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-nz5fg\"" Apr 17 17:39:01.062060 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.062021 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:39:01.062154 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.061983 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:39:01.062979 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.062964 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 17:39:01.069106 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.069082 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9"] Apr 17 17:39:01.090292 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.090267 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31412d80-4ead-4c24-a577-524b451c1920-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.090430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.090314 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.090430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.090338 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcz5x\" (UniqueName: \"kubernetes.io/projected/31412d80-4ead-4c24-a577-524b451c1920-kube-api-access-jcz5x\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.090561 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.090436 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.090561 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.090498 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.090561 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:39:01.090528 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191321 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191321 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191326 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191582 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191383 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31412d80-4ead-4c24-a577-524b451c1920-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191582 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191419 2565 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191582 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191448 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcz5x\" (UniqueName: \"kubernetes.io/projected/31412d80-4ead-4c24-a577-524b451c1920-kube-api-access-jcz5x\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191582 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191491 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191815 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191751 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191815 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191764 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191949 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191914 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.191949 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.191916 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.194123 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.194094 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31412d80-4ead-4c24-a577-524b451c1920-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.200148 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.200125 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcz5x\" (UniqueName: \"kubernetes.io/projected/31412d80-4ead-4c24-a577-524b451c1920-kube-api-access-jcz5x\") pod 
\"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.387051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.386977 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:01.524501 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.524476 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9"] Apr 17 17:39:01.526158 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:39:01.526131 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31412d80_4ead_4c24_a577_524b451c1920.slice/crio-445aea4817d75bccb6111e15d8bf0b0d40e9b3563c42bf814fc94086d75421f2 WatchSource:0}: Error finding container 445aea4817d75bccb6111e15d8bf0b0d40e9b3563c42bf814fc94086d75421f2: Status 404 returned error can't find the container with id 445aea4817d75bccb6111e15d8bf0b0d40e9b3563c42bf814fc94086d75421f2 Apr 17 17:39:01.801169 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.801130 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerStarted","Data":"4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35"} Apr 17 17:39:01.801169 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:01.801169 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerStarted","Data":"445aea4817d75bccb6111e15d8bf0b0d40e9b3563c42bf814fc94086d75421f2"} Apr 17 17:39:02.807205 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:39:02.807169 2565 generic.go:358] "Generic (PLEG): container finished" podID="31412d80-4ead-4c24-a577-524b451c1920" containerID="4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35" exitCode=0 Apr 17 17:39:02.807684 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:02.807234 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerDied","Data":"4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35"} Apr 17 17:39:03.813017 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:03.812981 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerStarted","Data":"ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51"} Apr 17 17:39:03.813017 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:03.813020 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerStarted","Data":"4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794"} Apr 17 17:39:03.813450 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:03.813053 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:03.847087 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:03.847028 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" podStartSLOduration=2.847007771 podStartE2EDuration="2.847007771s" podCreationTimestamp="2026-04-17 17:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:39:03.845084196 +0000 UTC m=+893.177701852" watchObservedRunningTime="2026-04-17 17:39:03.847007771 +0000 UTC m=+893.179625420" Apr 17 17:39:11.184670 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:11.184641 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log" Apr 17 17:39:11.189456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:11.189427 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log" Apr 17 17:39:11.387651 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:11.387612 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:11.387651 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:11.387660 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:11.390094 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:11.390071 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:11.855404 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:11.855370 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:39:32.860068 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:39:32.860040 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:41:01.284116 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:41:01.284070 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9"] Apr 17 17:41:01.284607 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:01.284518 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="tokenizer" containerID="cri-o://ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51" gracePeriod=30 Apr 17 17:41:01.284743 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:01.284704 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="main" containerID="cri-o://4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794" gracePeriod=30 Apr 17 17:41:01.855241 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:01.855190 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.52:8082/healthz\": dial tcp 10.132.0.52:8082: connect: connection refused" Apr 17 17:41:02.297305 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.297213 2565 generic.go:358] "Generic (PLEG): container finished" podID="31412d80-4ead-4c24-a577-524b451c1920" containerID="4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794" exitCode=0 Apr 17 17:41:02.297305 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.297293 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" 
event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerDied","Data":"4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794"} Apr 17 17:41:02.630650 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.630625 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:41:02.654616 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654581 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-tmp\") pod \"31412d80-4ead-4c24-a577-524b451c1920\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " Apr 17 17:41:02.654616 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654624 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-uds\") pod \"31412d80-4ead-4c24-a577-524b451c1920\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " Apr 17 17:41:02.654616 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654661 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-cache\") pod \"31412d80-4ead-4c24-a577-524b451c1920\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654694 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31412d80-4ead-4c24-a577-524b451c1920-tls-certs\") pod \"31412d80-4ead-4c24-a577-524b451c1920\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654748 2565 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-kserve-provision-location\") pod \"31412d80-4ead-4c24-a577-524b451c1920\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654850 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcz5x\" (UniqueName: \"kubernetes.io/projected/31412d80-4ead-4c24-a577-524b451c1920-kube-api-access-jcz5x\") pod \"31412d80-4ead-4c24-a577-524b451c1920\" (UID: \"31412d80-4ead-4c24-a577-524b451c1920\") " Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654971 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "31412d80-4ead-4c24-a577-524b451c1920" (UID: "31412d80-4ead-4c24-a577-524b451c1920"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.654995 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "31412d80-4ead-4c24-a577-524b451c1920" (UID: "31412d80-4ead-4c24-a577-524b451c1920"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.655041 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "31412d80-4ead-4c24-a577-524b451c1920" (UID: "31412d80-4ead-4c24-a577-524b451c1920"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.655213 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-uds\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.655669 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "31412d80-4ead-4c24-a577-524b451c1920" (UID: "31412d80-4ead-4c24-a577-524b451c1920"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:02.656311 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.655692 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-cache\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:41:02.657359 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.657332 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31412d80-4ead-4c24-a577-524b451c1920-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "31412d80-4ead-4c24-a577-524b451c1920" (UID: "31412d80-4ead-4c24-a577-524b451c1920"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:41:02.658032 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.658002 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31412d80-4ead-4c24-a577-524b451c1920-kube-api-access-jcz5x" (OuterVolumeSpecName: "kube-api-access-jcz5x") pod "31412d80-4ead-4c24-a577-524b451c1920" (UID: "31412d80-4ead-4c24-a577-524b451c1920"). 
InnerVolumeSpecName "kube-api-access-jcz5x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:41:02.757005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.756966 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-tokenizer-tmp\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:41:02.757005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.756999 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31412d80-4ead-4c24-a577-524b451c1920-tls-certs\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:41:02.757005 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.757009 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/31412d80-4ead-4c24-a577-524b451c1920-kserve-provision-location\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:41:02.757241 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:02.757020 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jcz5x\" (UniqueName: \"kubernetes.io/projected/31412d80-4ead-4c24-a577-524b451c1920-kube-api-access-jcz5x\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:41:03.303139 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.303101 2565 generic.go:358] "Generic (PLEG): container finished" podID="31412d80-4ead-4c24-a577-524b451c1920" containerID="ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51" exitCode=0 Apr 17 17:41:03.303611 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.303186 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" 
event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerDied","Data":"ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51"} Apr 17 17:41:03.303611 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.303208 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" Apr 17 17:41:03.303611 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.303221 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9" event={"ID":"31412d80-4ead-4c24-a577-524b451c1920","Type":"ContainerDied","Data":"445aea4817d75bccb6111e15d8bf0b0d40e9b3563c42bf814fc94086d75421f2"} Apr 17 17:41:03.303611 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.303241 2565 scope.go:117] "RemoveContainer" containerID="ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51" Apr 17 17:41:03.312759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.312737 2565 scope.go:117] "RemoveContainer" containerID="4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794" Apr 17 17:41:03.321816 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.321795 2565 scope.go:117] "RemoveContainer" containerID="4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35" Apr 17 17:41:03.324489 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.324440 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9"] Apr 17 17:41:03.327896 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.327871 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-ck6v9"] Apr 17 17:41:03.330915 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.330903 2565 scope.go:117] "RemoveContainer" containerID="ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51" Apr 
17 17:41:03.331198 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:41:03.331180 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51\": container with ID starting with ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51 not found: ID does not exist" containerID="ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51" Apr 17 17:41:03.331243 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.331209 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51"} err="failed to get container status \"ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51\": rpc error: code = NotFound desc = could not find container \"ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51\": container with ID starting with ed08ea711366b2aa4ecb596697f8ae540adae66caf68d1c3b9fe14bc0531da51 not found: ID does not exist" Apr 17 17:41:03.331243 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.331227 2565 scope.go:117] "RemoveContainer" containerID="4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794" Apr 17 17:41:03.331472 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:41:03.331455 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794\": container with ID starting with 4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794 not found: ID does not exist" containerID="4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794" Apr 17 17:41:03.331521 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.331476 2565 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794"} err="failed to get container status \"4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794\": rpc error: code = NotFound desc = could not find container \"4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794\": container with ID starting with 4ca1f31659493113bc254dfbdbf64e1b6fbb862511823d0962c531eb77e89794 not found: ID does not exist" Apr 17 17:41:03.331521 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.331490 2565 scope.go:117] "RemoveContainer" containerID="4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35" Apr 17 17:41:03.331685 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:41:03.331661 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35\": container with ID starting with 4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35 not found: ID does not exist" containerID="4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35" Apr 17 17:41:03.331742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:03.331695 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35"} err="failed to get container status \"4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35\": rpc error: code = NotFound desc = could not find container \"4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35\": container with ID starting with 4dfe12655f296798d3893c92a911bc349af118f4105977c89eea42b71e96dd35 not found: ID does not exist" Apr 17 17:41:05.194781 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:05.194745 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31412d80-4ead-4c24-a577-524b451c1920" 
path="/var/lib/kubelet/pods/31412d80-4ead-4c24-a577-524b451c1920/volumes" Apr 17 17:41:06.829043 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829005 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj"] Apr 17 17:41:06.829586 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829569 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="storage-initializer" Apr 17 17:41:06.829638 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829590 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="storage-initializer" Apr 17 17:41:06.829638 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829619 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="tokenizer" Apr 17 17:41:06.829638 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829628 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="tokenizer" Apr 17 17:41:06.829732 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829650 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="main" Apr 17 17:41:06.829732 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829660 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="main" Apr 17 17:41:06.829799 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829752 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="tokenizer" Apr 17 17:41:06.829799 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.829765 2565 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="31412d80-4ead-4c24-a577-524b451c1920" containerName="main" Apr 17 17:41:06.835258 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.835240 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.838079 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.838054 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 17 17:41:06.838238 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.838121 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:41:06.839258 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.839235 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-fnt76\"" Apr 17 17:41:06.839414 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.839356 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:41:06.839414 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.839383 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:41:06.843462 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.843236 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj"] Apr 17 17:41:06.892803 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.892763 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.892803 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.892807 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lrld\" (UniqueName: \"kubernetes.io/projected/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kube-api-access-2lrld\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.893034 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.892891 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.893034 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.892919 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.893034 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.893021 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tls-certs\") pod 
\"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.893137 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.893048 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.993634 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.993595 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.993634 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.993638 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.993914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.993681 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" 
(UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.993914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.993702 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lrld\" (UniqueName: \"kubernetes.io/projected/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kube-api-access-2lrld\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.993914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.993732 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.993914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.993748 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.994198 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.994176 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.994268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.994207 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.994268 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.994238 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.994356 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.994280 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:06.996278 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:06.996252 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:07.002054 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:41:07.002032 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lrld\" (UniqueName: \"kubernetes.io/projected/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kube-api-access-2lrld\") pod \"stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:07.146354 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:07.146269 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:07.283164 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:07.283133 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj"] Apr 17 17:41:07.285162 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:41:07.285123 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad2d3473_0ad5_4c98_9186_f2951aa434b3.slice/crio-49c499dab49968f58e4d24427decf7da07d2adc2cfd8ab7748be8cf8234669c9 WatchSource:0}: Error finding container 49c499dab49968f58e4d24427decf7da07d2adc2cfd8ab7748be8cf8234669c9: Status 404 returned error can't find the container with id 49c499dab49968f58e4d24427decf7da07d2adc2cfd8ab7748be8cf8234669c9 Apr 17 17:41:07.287222 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:07.287200 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:41:07.319564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:07.319535 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" 
event={"ID":"ad2d3473-0ad5-4c98-9186-f2951aa434b3","Type":"ContainerStarted","Data":"49c499dab49968f58e4d24427decf7da07d2adc2cfd8ab7748be8cf8234669c9"} Apr 17 17:41:08.326910 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:08.326877 2565 generic.go:358] "Generic (PLEG): container finished" podID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerID="a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3" exitCode=0 Apr 17 17:41:08.327266 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:08.326968 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" event={"ID":"ad2d3473-0ad5-4c98-9186-f2951aa434b3","Type":"ContainerDied","Data":"a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3"} Apr 17 17:41:09.333518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:09.333482 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" event={"ID":"ad2d3473-0ad5-4c98-9186-f2951aa434b3","Type":"ContainerStarted","Data":"edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00"} Apr 17 17:41:09.333518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:09.333522 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" event={"ID":"ad2d3473-0ad5-4c98-9186-f2951aa434b3","Type":"ContainerStarted","Data":"bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a"} Apr 17 17:41:09.334001 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:09.333664 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:09.356742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:09.356688 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" podStartSLOduration=3.35667311 podStartE2EDuration="3.35667311s" podCreationTimestamp="2026-04-17 17:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:41:09.353910285 +0000 UTC m=+1018.686527928" watchObservedRunningTime="2026-04-17 17:41:09.35667311 +0000 UTC m=+1018.689290755" Apr 17 17:41:17.146910 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:17.146867 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:17.146910 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:17.146919 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:17.149622 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:17.149595 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:17.364570 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:17.364542 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:41:38.369627 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:41:38.369597 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:42:47.625564 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:47.625525 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj"] Apr 17 17:42:47.626129 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:47.625955 2565 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="main" containerID="cri-o://bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a" gracePeriod=30 Apr 17 17:42:47.626129 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:47.626031 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="tokenizer" containerID="cri-o://edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00" gracePeriod=30 Apr 17 17:42:48.369133 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:42:48.369096 2565 logging.go:55] [core] [Channel #143 SubChannel #144]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.53:9003", ServerName: "10.132.0.53:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.53:9003: connect: connection refused" Apr 17 17:42:48.728764 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.728733 2565 generic.go:358] "Generic (PLEG): container finished" podID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerID="bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a" exitCode=0 Apr 17 17:42:48.729123 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.728801 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" event={"ID":"ad2d3473-0ad5-4c98-9186-f2951aa434b3","Type":"ContainerDied","Data":"bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a"} Apr 17 17:42:48.881456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.881435 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:42:48.990087 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.989999 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-uds\") pod \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " Apr 17 17:42:48.990087 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990050 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kserve-provision-location\") pod \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " Apr 17 17:42:48.990326 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990106 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lrld\" (UniqueName: \"kubernetes.io/projected/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kube-api-access-2lrld\") pod \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " Apr 17 17:42:48.990326 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990140 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-tmp\") pod \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " Apr 17 17:42:48.990326 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990170 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tls-certs\") pod \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " Apr 17 
17:42:48.990326 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990193 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-cache\") pod \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\" (UID: \"ad2d3473-0ad5-4c98-9186-f2951aa434b3\") " Apr 17 17:42:48.990605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990313 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ad2d3473-0ad5-4c98-9186-f2951aa434b3" (UID: "ad2d3473-0ad5-4c98-9186-f2951aa434b3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:42:48.990605 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990531 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ad2d3473-0ad5-4c98-9186-f2951aa434b3" (UID: "ad2d3473-0ad5-4c98-9186-f2951aa434b3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:42:48.990700 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990602 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ad2d3473-0ad5-4c98-9186-f2951aa434b3" (UID: "ad2d3473-0ad5-4c98-9186-f2951aa434b3"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:42:48.990700 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990605 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-uds\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:42:48.990888 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.990864 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad2d3473-0ad5-4c98-9186-f2951aa434b3" (UID: "ad2d3473-0ad5-4c98-9186-f2951aa434b3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:42:48.992304 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.992283 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ad2d3473-0ad5-4c98-9186-f2951aa434b3" (UID: "ad2d3473-0ad5-4c98-9186-f2951aa434b3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:42:48.992469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:48.992452 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kube-api-access-2lrld" (OuterVolumeSpecName: "kube-api-access-2lrld") pod "ad2d3473-0ad5-4c98-9186-f2951aa434b3" (UID: "ad2d3473-0ad5-4c98-9186-f2951aa434b3"). InnerVolumeSpecName "kube-api-access-2lrld". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:42:49.091618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.091581 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lrld\" (UniqueName: \"kubernetes.io/projected/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kube-api-access-2lrld\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:42:49.091618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.091613 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-tmp\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:42:49.091618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.091624 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tls-certs\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:42:49.091894 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.091634 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-tokenizer-cache\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:42:49.091894 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.091643 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2d3473-0ad5-4c98-9186-f2951aa434b3-kserve-provision-location\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:42:49.369011 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.368965 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="main" probeResult="failure" output="timeout: failed to connect service 
\"10.132.0.53:9003\" within 1s: context deadline exceeded" Apr 17 17:42:49.735518 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.735435 2565 generic.go:358] "Generic (PLEG): container finished" podID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerID="edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00" exitCode=0 Apr 17 17:42:49.736007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.735511 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" event={"ID":"ad2d3473-0ad5-4c98-9186-f2951aa434b3","Type":"ContainerDied","Data":"edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00"} Apr 17 17:42:49.736007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.735541 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" Apr 17 17:42:49.736007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.735549 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj" event={"ID":"ad2d3473-0ad5-4c98-9186-f2951aa434b3","Type":"ContainerDied","Data":"49c499dab49968f58e4d24427decf7da07d2adc2cfd8ab7748be8cf8234669c9"} Apr 17 17:42:49.736007 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.735573 2565 scope.go:117] "RemoveContainer" containerID="edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00" Apr 17 17:42:49.744812 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.744557 2565 scope.go:117] "RemoveContainer" containerID="bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a" Apr 17 17:42:49.752698 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.752675 2565 scope.go:117] "RemoveContainer" containerID="a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3" Apr 17 17:42:49.758488 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.758463 2565 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj"] Apr 17 17:42:49.761637 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.761615 2565 scope.go:117] "RemoveContainer" containerID="edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00" Apr 17 17:42:49.761730 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.761711 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-65cfd8b5fb-279cj"] Apr 17 17:42:49.761952 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:42:49.761928 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00\": container with ID starting with edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00 not found: ID does not exist" containerID="edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00" Apr 17 17:42:49.762014 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.761963 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00"} err="failed to get container status \"edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00\": rpc error: code = NotFound desc = could not find container \"edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00\": container with ID starting with edbdb7a8bedc3110376e6c3dbe553f3cb57d117991752bf637d16e7879959c00 not found: ID does not exist" Apr 17 17:42:49.762014 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.761989 2565 scope.go:117] "RemoveContainer" containerID="bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a" Apr 17 17:42:49.762225 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:42:49.762209 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a\": container with ID starting with bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a not found: ID does not exist" containerID="bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a" Apr 17 17:42:49.762265 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.762228 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a"} err="failed to get container status \"bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a\": rpc error: code = NotFound desc = could not find container \"bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a\": container with ID starting with bd5f0bab979bdc4463a707cda9b01213454507b1c9c19f5e01e51b79b7540e4a not found: ID does not exist" Apr 17 17:42:49.762265 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.762250 2565 scope.go:117] "RemoveContainer" containerID="a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3" Apr 17 17:42:49.762470 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:42:49.762453 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3\": container with ID starting with a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3 not found: ID does not exist" containerID="a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3" Apr 17 17:42:49.762522 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.762475 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3"} err="failed to get container status \"a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3\": rpc error: code = 
NotFound desc = could not find container \"a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3\": container with ID starting with a7e1b9f17afaa4505fafc81396cfe1da29895f06b15a77d21077837de2f329a3 not found: ID does not exist" Apr 17 17:42:49.821941 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.821903 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7674b4456b-9l4k4"] Apr 17 17:42:49.822335 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.822321 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="storage-initializer" Apr 17 17:42:49.822378 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.822337 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="storage-initializer" Apr 17 17:42:49.822378 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.822356 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="tokenizer" Apr 17 17:42:49.822378 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.822362 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="tokenizer" Apr 17 17:42:49.822469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.822383 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="main" Apr 17 17:42:49.822469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.822391 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="main" Apr 17 17:42:49.822469 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.822450 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="main" Apr 17 17:42:49.822469 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:42:49.822461 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" containerName="tokenizer" Apr 17 17:42:49.827015 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.826994 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:49.829737 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.829717 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 17:42:49.829981 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.829962 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-drg5n\"" Apr 17 17:42:49.837086 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.837060 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7674b4456b-9l4k4"] Apr 17 17:42:49.899237 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.899200 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt2j4\" (UniqueName: \"kubernetes.io/projected/4d80993d-a624-4870-bd9e-9b65f601f330-kube-api-access-qt2j4\") pod \"llmisvc-controller-manager-7674b4456b-9l4k4\" (UID: \"4d80993d-a624-4870-bd9e-9b65f601f330\") " pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:49.899403 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:49.899253 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d80993d-a624-4870-bd9e-9b65f601f330-cert\") pod \"llmisvc-controller-manager-7674b4456b-9l4k4\" (UID: \"4d80993d-a624-4870-bd9e-9b65f601f330\") " pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:50.000155 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:50.000061 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt2j4\" (UniqueName: \"kubernetes.io/projected/4d80993d-a624-4870-bd9e-9b65f601f330-kube-api-access-qt2j4\") pod \"llmisvc-controller-manager-7674b4456b-9l4k4\" (UID: \"4d80993d-a624-4870-bd9e-9b65f601f330\") " pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:50.000155 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:50.000123 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d80993d-a624-4870-bd9e-9b65f601f330-cert\") pod \"llmisvc-controller-manager-7674b4456b-9l4k4\" (UID: \"4d80993d-a624-4870-bd9e-9b65f601f330\") " pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:50.002526 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:50.002501 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d80993d-a624-4870-bd9e-9b65f601f330-cert\") pod \"llmisvc-controller-manager-7674b4456b-9l4k4\" (UID: \"4d80993d-a624-4870-bd9e-9b65f601f330\") " pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:50.008473 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:50.008446 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt2j4\" (UniqueName: \"kubernetes.io/projected/4d80993d-a624-4870-bd9e-9b65f601f330-kube-api-access-qt2j4\") pod \"llmisvc-controller-manager-7674b4456b-9l4k4\" (UID: \"4d80993d-a624-4870-bd9e-9b65f601f330\") " pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:50.137604 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:50.137560 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:50.266412 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:50.266380 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7674b4456b-9l4k4"] Apr 17 17:42:50.268341 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:42:50.268293 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4d80993d_a624_4870_bd9e_9b65f601f330.slice/crio-68e54892f6365b4052a81cbf36f42e9870a40e73fe9d8399ebb68a318dd481bd WatchSource:0}: Error finding container 68e54892f6365b4052a81cbf36f42e9870a40e73fe9d8399ebb68a318dd481bd: Status 404 returned error can't find the container with id 68e54892f6365b4052a81cbf36f42e9870a40e73fe9d8399ebb68a318dd481bd Apr 17 17:42:50.742280 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:50.742247 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" event={"ID":"4d80993d-a624-4870-bd9e-9b65f601f330","Type":"ContainerStarted","Data":"68e54892f6365b4052a81cbf36f42e9870a40e73fe9d8399ebb68a318dd481bd"} Apr 17 17:42:51.197428 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:51.197389 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2d3473-0ad5-4c98-9186-f2951aa434b3" path="/var/lib/kubelet/pods/ad2d3473-0ad5-4c98-9186-f2951aa434b3/volumes" Apr 17 17:42:53.755250 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:53.755222 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" event={"ID":"4d80993d-a624-4870-bd9e-9b65f601f330","Type":"ContainerStarted","Data":"e160ac4b476fc1e87ccc084d22ceabd75a55d3a03128a5f4fc39f0acecf5fd28"} Apr 17 17:42:53.755618 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:53.755274 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" Apr 17 17:42:53.775577 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:42:53.775533 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4" podStartSLOduration=1.364240457 podStartE2EDuration="4.775515711s" podCreationTimestamp="2026-04-17 17:42:49 +0000 UTC" firstStartedPulling="2026-04-17 17:42:50.269573373 +0000 UTC m=+1119.602190997" lastFinishedPulling="2026-04-17 17:42:53.680848613 +0000 UTC m=+1123.013466251" observedRunningTime="2026-04-17 17:42:53.772136571 +0000 UTC m=+1123.104754216" watchObservedRunningTime="2026-04-17 17:42:53.775515711 +0000 UTC m=+1123.108133357"
Apr 17 17:43:24.765731 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:43:24.765651 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7674b4456b-9l4k4"
Apr 17 17:44:11.219250 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:44:11.219218 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:44:11.226324 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:44:11.226301 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:47:19.440757 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.440718 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"]
Apr 17 17:47:19.444653 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.444627 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.447541 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.447514 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 17 17:47:19.448437 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.448413 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 17:47:19.448553 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.448472 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\""
Apr 17 17:47:19.448553 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.448486 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 17:47:19.448553 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.448486 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-bv5r5\""
Apr 17 17:47:19.456082 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.456061 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"]
Apr 17 17:47:19.564954 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.564920 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.565159 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.564968 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a922c097-3648-48fd-9e20-b2760e0d52b8-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.565159 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.565066 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.565159 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.565110 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.565300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.565205 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.565300 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.565251 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r2kp\" (UniqueName: \"kubernetes.io/projected/a922c097-3648-48fd-9e20-b2760e0d52b8-kube-api-access-9r2kp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.666567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.666532 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.666742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.666580 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r2kp\" (UniqueName: \"kubernetes.io/projected/a922c097-3648-48fd-9e20-b2760e0d52b8-kube-api-access-9r2kp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.666742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.666617 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.666742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.666644 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a922c097-3648-48fd-9e20-b2760e0d52b8-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.666742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.666701 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.666973 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.666747 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.667021 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.666990 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.667062 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.667019 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.667097 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.667082 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.667172 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.667155 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.669208 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.669186 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a922c097-3648-48fd-9e20-b2760e0d52b8-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.675351 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.675320 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r2kp\" (UniqueName: \"kubernetes.io/projected/a922c097-3648-48fd-9e20-b2760e0d52b8-kube-api-access-9r2kp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.756162 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.756076 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:19.893708 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.893682 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"]
Apr 17 17:47:19.896022 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:47:19.895982 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda922c097_3648_48fd_9e20_b2760e0d52b8.slice/crio-d73992e304bf81e6d080cef827c450a44000d8558ee494e84666125e03c2e15f WatchSource:0}: Error finding container d73992e304bf81e6d080cef827c450a44000d8558ee494e84666125e03c2e15f: Status 404 returned error can't find the container with id d73992e304bf81e6d080cef827c450a44000d8558ee494e84666125e03c2e15f
Apr 17 17:47:19.898260 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:19.898241 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:47:20.810410 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:20.810375 2565 generic.go:358] "Generic (PLEG): container finished" podID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerID="3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252" exitCode=0
Apr 17 17:47:20.810759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:20.810425 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" event={"ID":"a922c097-3648-48fd-9e20-b2760e0d52b8","Type":"ContainerDied","Data":"3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252"}
Apr 17 17:47:20.810759 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:20.810446 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" event={"ID":"a922c097-3648-48fd-9e20-b2760e0d52b8","Type":"ContainerStarted","Data":"d73992e304bf81e6d080cef827c450a44000d8558ee494e84666125e03c2e15f"}
Apr 17 17:47:21.816808 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:21.816772 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" event={"ID":"a922c097-3648-48fd-9e20-b2760e0d52b8","Type":"ContainerStarted","Data":"e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c"}
Apr 17 17:47:21.816808 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:21.816815 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" event={"ID":"a922c097-3648-48fd-9e20-b2760e0d52b8","Type":"ContainerStarted","Data":"e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533"}
Apr 17 17:47:21.817256 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:21.816915 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:21.838073 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:21.838027 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" podStartSLOduration=2.838012713 podStartE2EDuration="2.838012713s" podCreationTimestamp="2026-04-17 17:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:47:21.835566308 +0000 UTC m=+1391.168183955" watchObservedRunningTime="2026-04-17 17:47:21.838012713 +0000 UTC m=+1391.170630359"
Apr 17 17:47:29.756909 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:29.756868 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:29.757417 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:29.756920 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:29.759722 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:29.759695 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:29.857855 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:29.857806 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:47:50.862495 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:47:50.862462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:49:11.253452 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:49:11.253423 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:49:11.276596 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:49:11.276573 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:50:04.981445 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:04.981413 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"]
Apr 17 17:50:04.981986 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:04.981685 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="main" containerID="cri-o://e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533" gracePeriod=30
Apr 17 17:50:04.981986 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:04.981744 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="tokenizer" containerID="cri-o://e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c" gracePeriod=30
Apr 17 17:50:05.465293 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:05.465259 2565 generic.go:358] "Generic (PLEG): container finished" podID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerID="e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533" exitCode=0
Apr 17 17:50:05.465478 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:05.465331 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" event={"ID":"a922c097-3648-48fd-9e20-b2760e0d52b8","Type":"ContainerDied","Data":"e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533"}
Apr 17 17:50:06.343430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.343405 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:50:06.472772 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.472688 2565 generic.go:358] "Generic (PLEG): container finished" podID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerID="e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c" exitCode=0
Apr 17 17:50:06.472945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.472774 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" event={"ID":"a922c097-3648-48fd-9e20-b2760e0d52b8","Type":"ContainerDied","Data":"e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c"}
Apr 17 17:50:06.472945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.472789 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"
Apr 17 17:50:06.472945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.472826 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6" event={"ID":"a922c097-3648-48fd-9e20-b2760e0d52b8","Type":"ContainerDied","Data":"d73992e304bf81e6d080cef827c450a44000d8558ee494e84666125e03c2e15f"}
Apr 17 17:50:06.472945 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.472864 2565 scope.go:117] "RemoveContainer" containerID="e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c"
Apr 17 17:50:06.481169 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481142 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-kserve-provision-location\") pod \"a922c097-3648-48fd-9e20-b2760e0d52b8\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") "
Apr 17 17:50:06.481330 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481213 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-tmp\") pod \"a922c097-3648-48fd-9e20-b2760e0d52b8\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") "
Apr 17 17:50:06.481330 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481256 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a922c097-3648-48fd-9e20-b2760e0d52b8-tls-certs\") pod \"a922c097-3648-48fd-9e20-b2760e0d52b8\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") "
Apr 17 17:50:06.481330 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481285 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r2kp\" (UniqueName: \"kubernetes.io/projected/a922c097-3648-48fd-9e20-b2760e0d52b8-kube-api-access-9r2kp\") pod \"a922c097-3648-48fd-9e20-b2760e0d52b8\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") "
Apr 17 17:50:06.481492 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481336 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-uds\") pod \"a922c097-3648-48fd-9e20-b2760e0d52b8\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") "
Apr 17 17:50:06.481492 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481358 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-cache\") pod \"a922c097-3648-48fd-9e20-b2760e0d52b8\" (UID: \"a922c097-3648-48fd-9e20-b2760e0d52b8\") "
Apr 17 17:50:06.481611 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481583 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a922c097-3648-48fd-9e20-b2760e0d52b8" (UID: "a922c097-3648-48fd-9e20-b2760e0d52b8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:50:06.481671 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481592 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a922c097-3648-48fd-9e20-b2760e0d52b8" (UID: "a922c097-3648-48fd-9e20-b2760e0d52b8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:50:06.481729 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481672 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-uds\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:50:06.481874 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.481834 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a922c097-3648-48fd-9e20-b2760e0d52b8" (UID: "a922c097-3648-48fd-9e20-b2760e0d52b8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:50:06.482060 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.482040 2565 scope.go:117] "RemoveContainer" containerID="e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533"
Apr 17 17:50:06.482199 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.482175 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a922c097-3648-48fd-9e20-b2760e0d52b8" (UID: "a922c097-3648-48fd-9e20-b2760e0d52b8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:50:06.483632 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.483609 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a922c097-3648-48fd-9e20-b2760e0d52b8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a922c097-3648-48fd-9e20-b2760e0d52b8" (UID: "a922c097-3648-48fd-9e20-b2760e0d52b8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:50:06.483714 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.483658 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a922c097-3648-48fd-9e20-b2760e0d52b8-kube-api-access-9r2kp" (OuterVolumeSpecName: "kube-api-access-9r2kp") pod "a922c097-3648-48fd-9e20-b2760e0d52b8" (UID: "a922c097-3648-48fd-9e20-b2760e0d52b8"). InnerVolumeSpecName "kube-api-access-9r2kp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:50:06.503137 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.503119 2565 scope.go:117] "RemoveContainer" containerID="3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252"
Apr 17 17:50:06.511493 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.511475 2565 scope.go:117] "RemoveContainer" containerID="e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c"
Apr 17 17:50:06.511742 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:50:06.511722 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c\": container with ID starting with e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c not found: ID does not exist" containerID="e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c"
Apr 17 17:50:06.511795 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.511753 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c"} err="failed to get container status \"e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c\": rpc error: code = NotFound desc = could not find container \"e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c\": container with ID starting with e826d1007590acbbad08b0da7ed33a4ecc9768c174b395a07d35c2cd11e21e5c not found: ID does not exist"
Apr 17 17:50:06.511795 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.511777 2565 scope.go:117] "RemoveContainer" containerID="e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533"
Apr 17 17:50:06.512035 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:50:06.512016 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533\": container with ID starting with e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533 not found: ID does not exist" containerID="e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533"
Apr 17 17:50:06.512096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.512045 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533"} err="failed to get container status \"e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533\": rpc error: code = NotFound desc = could not find container \"e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533\": container with ID starting with e5887cbd5bbfef2721b7819eea0be2ef66832983d9515d4fbcba5719dba6b533 not found: ID does not exist"
Apr 17 17:50:06.512096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.512067 2565 scope.go:117] "RemoveContainer" containerID="3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252"
Apr 17 17:50:06.512307 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:50:06.512291 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252\": container with ID starting with 3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252 not found: ID does not exist" containerID="3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252"
Apr 17 17:50:06.512349 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.512311 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252"} err="failed to get container status \"3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252\": rpc error: code = NotFound desc = could not find container \"3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252\": container with ID starting with 3af0eb5ac067f6078af9c4e4578bbd0f4748fb7e7e51c3b8e524cad69d752252 not found: ID does not exist"
Apr 17 17:50:06.582310 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.582263 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-tmp\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:50:06.582310 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.582293 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a922c097-3648-48fd-9e20-b2760e0d52b8-tls-certs\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:50:06.582310 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.582304 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9r2kp\" (UniqueName: \"kubernetes.io/projected/a922c097-3648-48fd-9e20-b2760e0d52b8-kube-api-access-9r2kp\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:50:06.582310 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.582314 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-tokenizer-cache\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:50:06.582310 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.582323 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a922c097-3648-48fd-9e20-b2760e0d52b8-kserve-provision-location\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:50:06.796982 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.796949 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"]
Apr 17 17:50:06.801233 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:06.801208 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schexl9t6"]
Apr 17 17:50:07.194323 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:07.194289 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" path="/var/lib/kubelet/pods/a922c097-3648-48fd-9e20-b2760e0d52b8/volumes"
Apr 17 17:50:39.660926 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.660886 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"]
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661294 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="main"
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661304 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="main"
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661320 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="tokenizer"
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661325 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="tokenizer"
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661340 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="storage-initializer"
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661346 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="storage-initializer"
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661404 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="main"
Apr 17 17:50:39.661456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.661416 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a922c097-3648-48fd-9e20-b2760e0d52b8" containerName="tokenizer"
Apr 17 17:50:39.664720 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.664689 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.668785 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.668751 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 17:50:39.668940 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.668796 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 17:50:39.668940 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.668757 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\""
Apr 17 17:50:39.668940 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.668887 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-c927h\""
Apr 17 17:50:39.668940 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.668758 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 17 17:50:39.679536 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.679508 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"]
Apr 17 17:50:39.805502 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.805460 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.805704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.805514 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.805704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.805565 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.805704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.805603 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05fd06-2ab9-4300-8966-16cd95842b7b-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.805704 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.805651 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49gmn\" (UniqueName: \"kubernetes.io/projected/2a05fd06-2ab9-4300-8966-16cd95842b7b-kube-api-access-49gmn\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.805914 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.805726 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.907055 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907015 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"
Apr 17 17:50:39.907267 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907092 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID:
\"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907267 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907132 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907267 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907182 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907267 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907212 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05fd06-2ab9-4300-8966-16cd95842b7b-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907531 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907269 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49gmn\" (UniqueName: \"kubernetes.io/projected/2a05fd06-2ab9-4300-8966-16cd95842b7b-kube-api-access-49gmn\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: 
\"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907531 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907495 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907638 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907538 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907638 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907602 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.907748 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.907692 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.910012 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.909989 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05fd06-2ab9-4300-8966-16cd95842b7b-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.915355 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.915296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49gmn\" (UniqueName: \"kubernetes.io/projected/2a05fd06-2ab9-4300-8966-16cd95842b7b-kube-api-access-49gmn\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:39.978066 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:39.978037 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:40.112955 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:40.112929 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"] Apr 17 17:50:40.114640 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:50:40.114602 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a05fd06_2ab9_4300_8966_16cd95842b7b.slice/crio-6b43dab99a0dc074049905f451780ce49f7e528a88b2908b5c8e819650a54848 WatchSource:0}: Error finding container 6b43dab99a0dc074049905f451780ce49f7e528a88b2908b5c8e819650a54848: Status 404 returned error can't find the container with id 6b43dab99a0dc074049905f451780ce49f7e528a88b2908b5c8e819650a54848 Apr 17 17:50:40.609743 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:40.609701 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerStarted","Data":"8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0"} Apr 17 17:50:40.609958 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:40.609750 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerStarted","Data":"6b43dab99a0dc074049905f451780ce49f7e528a88b2908b5c8e819650a54848"} Apr 17 17:50:41.615430 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:41.615393 2565 generic.go:358] "Generic (PLEG): container finished" podID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerID="8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0" exitCode=0 Apr 17 17:50:41.615813 ip-10-0-143-59 kubenswrapper[2565]: I0417 
17:50:41.615463 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerDied","Data":"8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0"} Apr 17 17:50:42.621219 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:42.621183 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerStarted","Data":"ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828"} Apr 17 17:50:42.621627 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:42.621227 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerStarted","Data":"c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885"} Apr 17 17:50:42.621627 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:42.621275 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:42.644834 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:42.644782 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" podStartSLOduration=3.644764213 podStartE2EDuration="3.644764213s" podCreationTimestamp="2026-04-17 17:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:50:42.643217263 +0000 UTC m=+1591.975834917" watchObservedRunningTime="2026-04-17 17:50:42.644764213 +0000 UTC m=+1591.977381859" Apr 17 17:50:49.978774 ip-10-0-143-59 
kubenswrapper[2565]: I0417 17:50:49.978738 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:49.978774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:49.978785 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:49.981733 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:49.981713 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:50:50.653773 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:50:50.653741 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:51:11.658904 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:11.658870 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:51:32.410123 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:32.410082 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"] Apr 17 17:51:32.410515 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:32.410488 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="main" containerID="cri-o://c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885" gracePeriod=30 Apr 17 17:51:32.410633 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:32.410574 2565 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="tokenizer" containerID="cri-o://ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828" gracePeriod=30 Apr 17 17:51:32.833186 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:32.833154 2565 generic.go:358] "Generic (PLEG): container finished" podID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerID="c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885" exitCode=0 Apr 17 17:51:32.833343 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:32.833234 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerDied","Data":"c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885"} Apr 17 17:51:33.658885 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.658829 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:51:33.783051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.782966 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-cache\") pod \"2a05fd06-2ab9-4300-8966-16cd95842b7b\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " Apr 17 17:51:33.783051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783022 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-tmp\") pod \"2a05fd06-2ab9-4300-8966-16cd95842b7b\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " Apr 17 17:51:33.783051 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783052 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-kserve-provision-location\") pod \"2a05fd06-2ab9-4300-8966-16cd95842b7b\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " Apr 17 17:51:33.783339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783087 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-uds\") pod \"2a05fd06-2ab9-4300-8966-16cd95842b7b\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " Apr 17 17:51:33.783339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783112 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05fd06-2ab9-4300-8966-16cd95842b7b-tls-certs\") pod \"2a05fd06-2ab9-4300-8966-16cd95842b7b\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " Apr 17 
17:51:33.783339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783187 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49gmn\" (UniqueName: \"kubernetes.io/projected/2a05fd06-2ab9-4300-8966-16cd95842b7b-kube-api-access-49gmn\") pod \"2a05fd06-2ab9-4300-8966-16cd95842b7b\" (UID: \"2a05fd06-2ab9-4300-8966-16cd95842b7b\") " Apr 17 17:51:33.783339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783293 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2a05fd06-2ab9-4300-8966-16cd95842b7b" (UID: "2a05fd06-2ab9-4300-8966-16cd95842b7b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:33.783339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783327 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2a05fd06-2ab9-4300-8966-16cd95842b7b" (UID: "2a05fd06-2ab9-4300-8966-16cd95842b7b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:33.783591 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783383 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2a05fd06-2ab9-4300-8966-16cd95842b7b" (UID: "2a05fd06-2ab9-4300-8966-16cd95842b7b"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:33.783591 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783498 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-cache\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:51:33.783591 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783512 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-tmp\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:51:33.783591 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783522 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-tokenizer-uds\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:51:33.783825 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.783804 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2a05fd06-2ab9-4300-8966-16cd95842b7b" (UID: "2a05fd06-2ab9-4300-8966-16cd95842b7b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:33.785339 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.785313 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a05fd06-2ab9-4300-8966-16cd95842b7b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2a05fd06-2ab9-4300-8966-16cd95842b7b" (UID: "2a05fd06-2ab9-4300-8966-16cd95842b7b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:51:33.785422 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.785321 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a05fd06-2ab9-4300-8966-16cd95842b7b-kube-api-access-49gmn" (OuterVolumeSpecName: "kube-api-access-49gmn") pod "2a05fd06-2ab9-4300-8966-16cd95842b7b" (UID: "2a05fd06-2ab9-4300-8966-16cd95842b7b"). InnerVolumeSpecName "kube-api-access-49gmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:51:33.838679 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.838644 2565 generic.go:358] "Generic (PLEG): container finished" podID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerID="ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828" exitCode=0 Apr 17 17:51:33.838903 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.838713 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" Apr 17 17:51:33.838903 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.838724 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerDied","Data":"ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828"} Apr 17 17:51:33.838903 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.838765 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh" event={"ID":"2a05fd06-2ab9-4300-8966-16cd95842b7b","Type":"ContainerDied","Data":"6b43dab99a0dc074049905f451780ce49f7e528a88b2908b5c8e819650a54848"} Apr 17 17:51:33.838903 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.838791 2565 scope.go:117] "RemoveContainer" 
containerID="ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828" Apr 17 17:51:33.848260 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.848241 2565 scope.go:117] "RemoveContainer" containerID="c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885" Apr 17 17:51:33.856590 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.856569 2565 scope.go:117] "RemoveContainer" containerID="8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0" Apr 17 17:51:33.863907 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.863880 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"] Apr 17 17:51:33.865407 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.865393 2565 scope.go:117] "RemoveContainer" containerID="ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828" Apr 17 17:51:33.865686 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:51:33.865668 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828\": container with ID starting with ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828 not found: ID does not exist" containerID="ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828" Apr 17 17:51:33.865734 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.865696 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828"} err="failed to get container status \"ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828\": rpc error: code = NotFound desc = could not find container \"ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828\": container with ID starting with ad7dab980fc983facc40cfe61f99cacc858772da513a5d0eaf2df25269f67828 not found: ID does not exist" Apr 17 
17:51:33.865734 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.865713 2565 scope.go:117] "RemoveContainer" containerID="c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885" Apr 17 17:51:33.865982 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:51:33.865960 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885\": container with ID starting with c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885 not found: ID does not exist" containerID="c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885" Apr 17 17:51:33.866050 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.865991 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885"} err="failed to get container status \"c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885\": rpc error: code = NotFound desc = could not find container \"c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885\": container with ID starting with c14e696b99a9866fd51d47d6fb47dad68bdea98e814afb6074db9ff2a3eb4885 not found: ID does not exist" Apr 17 17:51:33.866050 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.866008 2565 scope.go:117] "RemoveContainer" containerID="8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0" Apr 17 17:51:33.866249 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:51:33.866231 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0\": container with ID starting with 8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0 not found: ID does not exist" containerID="8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0" Apr 17 17:51:33.866300 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.866254 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0"} err="failed to get container status \"8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0\": rpc error: code = NotFound desc = could not find container \"8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0\": container with ID starting with 8f105b9fa8aad53786b9d8bf96f92cdfb28d170cc595b9bb7403cdd810f7d2a0 not found: ID does not exist" Apr 17 17:51:33.870891 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.870867 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5bb95kbbrh"] Apr 17 17:51:33.884288 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.884262 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a05fd06-2ab9-4300-8966-16cd95842b7b-kserve-provision-location\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:51:33.884288 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.884289 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a05fd06-2ab9-4300-8966-16cd95842b7b-tls-certs\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:51:33.884435 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:33.884300 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49gmn\" (UniqueName: \"kubernetes.io/projected/2a05fd06-2ab9-4300-8966-16cd95842b7b-kube-api-access-49gmn\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\"" Apr 17 17:51:35.194782 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:51:35.194749 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" 
path="/var/lib/kubelet/pods/2a05fd06-2ab9-4300-8966-16cd95842b7b/volumes"
Apr 17 17:54:11.294629 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:54:11.294600 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:54:11.311732 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:54:11.311706 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:57:19.805567 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.805484 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vdf8j/must-gather-cp77k"]
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.805894 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="storage-initializer"
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.805905 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="storage-initializer"
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.805916 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="main"
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.805922 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="main"
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.805934 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="tokenizer"
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.805940 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="tokenizer"
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.806008 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="tokenizer"
Apr 17 17:57:19.808038 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.806016 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a05fd06-2ab9-4300-8966-16cd95842b7b" containerName="main"
Apr 17 17:57:19.808992 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.808974 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:19.830094 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.830069 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vdf8j\"/\"default-dockercfg-fwbkb\""
Apr 17 17:57:19.830267 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.830105 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vdf8j\"/\"kube-root-ca.crt\""
Apr 17 17:57:19.832248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.832227 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vdf8j\"/\"openshift-service-ca.crt\""
Apr 17 17:57:19.866860 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.866798 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vdf8j/must-gather-cp77k"]
Apr 17 17:57:19.960252 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.960218 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-must-gather-output\") pod \"must-gather-cp77k\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") " pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:19.960434 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:19.960347 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqpw\" (UniqueName: \"kubernetes.io/projected/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-kube-api-access-fjqpw\") pod \"must-gather-cp77k\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") " pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:20.061385 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:20.061283 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-must-gather-output\") pod \"must-gather-cp77k\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") " pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:20.061532 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:20.061431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqpw\" (UniqueName: \"kubernetes.io/projected/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-kube-api-access-fjqpw\") pod \"must-gather-cp77k\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") " pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:20.061675 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:20.061650 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-must-gather-output\") pod \"must-gather-cp77k\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") " pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:20.072774 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:20.072746 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqpw\" (UniqueName: \"kubernetes.io/projected/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-kube-api-access-fjqpw\") pod \"must-gather-cp77k\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") " pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:20.118264 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:20.118229 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:20.255694 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:20.255670 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vdf8j/must-gather-cp77k"]
Apr 17 17:57:20.257445 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:57:20.257416 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb4abef8_ded0_4f91_8c1c_1afbc875ec40.slice/crio-adec294acebf6183e653cb559a529aa97b0688d5518048f4b176eb3d03b0e251 WatchSource:0}: Error finding container adec294acebf6183e653cb559a529aa97b0688d5518048f4b176eb3d03b0e251: Status 404 returned error can't find the container with id adec294acebf6183e653cb559a529aa97b0688d5518048f4b176eb3d03b0e251
Apr 17 17:57:20.259131 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:20.259114 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:57:21.208374 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:21.208320 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdf8j/must-gather-cp77k" event={"ID":"cb4abef8-ded0-4f91-8c1c-1afbc875ec40","Type":"ContainerStarted","Data":"adec294acebf6183e653cb559a529aa97b0688d5518048f4b176eb3d03b0e251"}
Apr 17 17:57:25.230219 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:25.230117 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdf8j/must-gather-cp77k" event={"ID":"cb4abef8-ded0-4f91-8c1c-1afbc875ec40","Type":"ContainerStarted","Data":"1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3"}
Apr 17 17:57:25.230219 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:25.230176 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdf8j/must-gather-cp77k" event={"ID":"cb4abef8-ded0-4f91-8c1c-1afbc875ec40","Type":"ContainerStarted","Data":"c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90"}
Apr 17 17:57:25.256234 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:25.256174 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vdf8j/must-gather-cp77k" podStartSLOduration=1.824912782 podStartE2EDuration="6.256158486s" podCreationTimestamp="2026-04-17 17:57:19 +0000 UTC" firstStartedPulling="2026-04-17 17:57:20.259236662 +0000 UTC m=+1989.591854286" lastFinishedPulling="2026-04-17 17:57:24.690482366 +0000 UTC m=+1994.023099990" observedRunningTime="2026-04-17 17:57:25.254211027 +0000 UTC m=+1994.586828673" watchObservedRunningTime="2026-04-17 17:57:25.256158486 +0000 UTC m=+1994.588776132"
Apr 17 17:57:49.776674 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:49.776642 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-564dfc4b58-br5kx_fba1d80e-239c-4b35-afc0-5052c340a9ee/router/0.log"
Apr 17 17:57:50.725951 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:50.725919 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-564dfc4b58-br5kx_fba1d80e-239c-4b35-afc0-5052c340a9ee/router/0.log"
Apr 17 17:57:51.568075 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:51.568045 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-97cwd_c417a2cc-773d-4c5c-92c9-c3f262edf566/authorino/0.log"
Apr 17 17:57:51.636930 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:51.636906 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-r2tw7_cbcc1b92-0ae7-45c5-9394-e10768ea865e/kuadrant-console-plugin/0.log"
Apr 17 17:57:53.353659 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:53.353621 2565 generic.go:358] "Generic (PLEG): container finished" podID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerID="c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90" exitCode=0
Apr 17 17:57:53.354096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:53.353696 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdf8j/must-gather-cp77k" event={"ID":"cb4abef8-ded0-4f91-8c1c-1afbc875ec40","Type":"ContainerDied","Data":"c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90"}
Apr 17 17:57:53.354096 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:53.354040 2565 scope.go:117] "RemoveContainer" containerID="c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90"
Apr 17 17:57:53.407444 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:53.407416 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vdf8j_must-gather-cp77k_cb4abef8-ded0-4f91-8c1c-1afbc875ec40/gather/0.log"
Apr 17 17:57:56.953999 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:56.953965 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bf9w9_7b024619-6a7c-49fc-b69c-a21fab2b9f5c/global-pull-secret-syncer/0.log"
Apr 17 17:57:57.036357 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:57.036328 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gnmlr_cf0e9c4d-2f86-4ecc-8ccf-c91544212af9/konnectivity-agent/0.log"
Apr 17 17:57:57.220129 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:57.220043 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-59.ec2.internal_38e6c6efcebb2223fd3833916862d318/haproxy/0.log"
Apr 17 17:57:58.922063 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:58.922028 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vdf8j/must-gather-cp77k"]
Apr 17 17:57:58.922465 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:58.922277 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vdf8j/must-gather-cp77k" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerName="copy" containerID="cri-o://1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3" gracePeriod=2
Apr 17 17:57:58.929806 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:58.929779 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vdf8j/must-gather-cp77k"]
Apr 17 17:57:59.156886 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.156857 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vdf8j_must-gather-cp77k_cb4abef8-ded0-4f91-8c1c-1afbc875ec40/copy/0.log"
Apr 17 17:57:59.157255 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.157238 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:59.159660 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.159637 2565 status_manager.go:895] "Failed to get status for pod" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" pod="openshift-must-gather-vdf8j/must-gather-cp77k" err="pods \"must-gather-cp77k\" is forbidden: User \"system:node:ip-10-0-143-59.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vdf8j\": no relationship found between node 'ip-10-0-143-59.ec2.internal' and this object"
Apr 17 17:57:59.232215 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.232128 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-must-gather-output\") pod \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") "
Apr 17 17:57:59.232215 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.232191 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjqpw\" (UniqueName: \"kubernetes.io/projected/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-kube-api-access-fjqpw\") pod \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\" (UID: \"cb4abef8-ded0-4f91-8c1c-1afbc875ec40\") "
Apr 17 17:57:59.234465 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.234444 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-kube-api-access-fjqpw" (OuterVolumeSpecName: "kube-api-access-fjqpw") pod "cb4abef8-ded0-4f91-8c1c-1afbc875ec40" (UID: "cb4abef8-ded0-4f91-8c1c-1afbc875ec40"). InnerVolumeSpecName "kube-api-access-fjqpw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:57:59.238125 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.238104 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "cb4abef8-ded0-4f91-8c1c-1afbc875ec40" (UID: "cb4abef8-ded0-4f91-8c1c-1afbc875ec40"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:57:59.333260 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.333223 2565 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-must-gather-output\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:57:59.333260 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.333262 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjqpw\" (UniqueName: \"kubernetes.io/projected/cb4abef8-ded0-4f91-8c1c-1afbc875ec40-kube-api-access-fjqpw\") on node \"ip-10-0-143-59.ec2.internal\" DevicePath \"\""
Apr 17 17:57:59.380968 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.380942 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vdf8j_must-gather-cp77k_cb4abef8-ded0-4f91-8c1c-1afbc875ec40/copy/0.log"
Apr 17 17:57:59.381248 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.381225 2565 generic.go:358] "Generic (PLEG): container finished" podID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerID="1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3" exitCode=143
Apr 17 17:57:59.381290 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.381277 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdf8j/must-gather-cp77k"
Apr 17 17:57:59.381342 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.381327 2565 scope.go:117] "RemoveContainer" containerID="1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3"
Apr 17 17:57:59.389983 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.389964 2565 scope.go:117] "RemoveContainer" containerID="c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90"
Apr 17 17:57:59.402666 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.402642 2565 scope.go:117] "RemoveContainer" containerID="1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3"
Apr 17 17:57:59.402987 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:57:59.402965 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3\": container with ID starting with 1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3 not found: ID does not exist" containerID="1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3"
Apr 17 17:57:59.403074 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.402996 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3"} err="failed to get container status \"1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3\": rpc error: code = NotFound desc = could not find container \"1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3\": container with ID starting with 1eff1c622d7ac290c309a1ccbfcba32f6a82f4ff04b551421015639b64c0afa3 not found: ID does not exist"
Apr 17 17:57:59.403074 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.403016 2565 scope.go:117] "RemoveContainer" containerID="c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90"
Apr 17 17:57:59.403234 ip-10-0-143-59 kubenswrapper[2565]: E0417 17:57:59.403217 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90\": container with ID starting with c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90 not found: ID does not exist" containerID="c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90"
Apr 17 17:57:59.403274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:57:59.403240 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90"} err="failed to get container status \"c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90\": rpc error: code = NotFound desc = could not find container \"c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90\": container with ID starting with c82f48dfa7b05021c6dd7f8669a8699f3f93bf9adea506fe42abe3e85db62f90 not found: ID does not exist"
Apr 17 17:58:01.051978 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:01.051947 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-97cwd_c417a2cc-773d-4c5c-92c9-c3f262edf566/authorino/0.log"
Apr 17 17:58:01.190929 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:01.190904 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-r2tw7_cbcc1b92-0ae7-45c5-9394-e10768ea865e/kuadrant-console-plugin/0.log"
Apr 17 17:58:01.195055 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:01.195028 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" path="/var/lib/kubelet/pods/cb4abef8-ded0-4f91-8c1c-1afbc875ec40/volumes"
Apr 17 17:58:02.321504 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.321474 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5b2566b1-eed4-4934-ad70-704adac645ac/alertmanager/0.log"
Apr 17 17:58:02.357709 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.357683 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5b2566b1-eed4-4934-ad70-704adac645ac/config-reloader/0.log"
Apr 17 17:58:02.385600 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.385567 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5b2566b1-eed4-4934-ad70-704adac645ac/kube-rbac-proxy-web/0.log"
Apr 17 17:58:02.410562 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.410539 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5b2566b1-eed4-4934-ad70-704adac645ac/kube-rbac-proxy/0.log"
Apr 17 17:58:02.436886 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.436861 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5b2566b1-eed4-4934-ad70-704adac645ac/kube-rbac-proxy-metric/0.log"
Apr 17 17:58:02.464327 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.464303 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5b2566b1-eed4-4934-ad70-704adac645ac/prom-label-proxy/0.log"
Apr 17 17:58:02.491112 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.491079 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5b2566b1-eed4-4934-ad70-704adac645ac/init-config-reloader/0.log"
Apr 17 17:58:02.547193 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.547158 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zl5fr_f65f4a7f-b841-425d-b6cf-02bd9f48fd69/cluster-monitoring-operator/0.log"
Apr 17 17:58:02.689963 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.689883 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8b7cf6bd8-jjfpm_9019f918-5ea9-4140-aa80-f6a686c46531/metrics-server/0.log"
Apr 17 17:58:02.888508 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.888481 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lcv9k_f3361750-3e11-42c6-b36a-e43635abdcfe/node-exporter/0.log"
Apr 17 17:58:02.919550 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.919526 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lcv9k_f3361750-3e11-42c6-b36a-e43635abdcfe/kube-rbac-proxy/0.log"
Apr 17 17:58:02.951583 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:02.951494 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lcv9k_f3361750-3e11-42c6-b36a-e43635abdcfe/init-textfile/0.log"
Apr 17 17:58:03.523157 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.523129 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-74bnv_0e9e4663-293c-4ae6-b4a9-c7dd6e747e96/prometheus-operator-admission-webhook/0.log"
Apr 17 17:58:03.564645 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.564610 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648b7db8cc-wk4m9_5ce76d2b-04a7-463e-bd7c-8f6237da5475/telemeter-client/0.log"
Apr 17 17:58:03.589599 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.589574 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648b7db8cc-wk4m9_5ce76d2b-04a7-463e-bd7c-8f6237da5475/reload/0.log"
Apr 17 17:58:03.618916 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.618891 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-648b7db8cc-wk4m9_5ce76d2b-04a7-463e-bd7c-8f6237da5475/kube-rbac-proxy/0.log"
Apr 17 17:58:03.671206 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.671175 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cf98947dc-vk8tl_3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd/thanos-query/0.log"
Apr 17 17:58:03.710081 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.710052 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cf98947dc-vk8tl_3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd/kube-rbac-proxy-web/0.log"
Apr 17 17:58:03.742409 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.742378 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cf98947dc-vk8tl_3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd/kube-rbac-proxy/0.log"
Apr 17 17:58:03.781491 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.781424 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cf98947dc-vk8tl_3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd/prom-label-proxy/0.log"
Apr 17 17:58:03.821813 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.821775 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cf98947dc-vk8tl_3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd/kube-rbac-proxy-rules/0.log"
Apr 17 17:58:03.857776 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:03.857748 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cf98947dc-vk8tl_3fbf76fd-2efd-4c3f-9e4d-de9aa98758dd/kube-rbac-proxy-metrics/0.log"
Apr 17 17:58:04.959606 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:04.959563 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-h9fr2_b697600b-a4a4-48b8-b42e-9965d51b283c/networking-console-plugin/0.log"
Apr 17 17:58:05.405456 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.405426 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"]
Apr 17 17:58:05.405815 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.405803 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerName="gather"
Apr 17 17:58:05.405877 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.405817 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerName="gather"
Apr 17 17:58:05.405877 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.405831 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerName="copy"
Apr 17 17:58:05.405877 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.405851 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerName="copy"
Apr 17 17:58:05.405979 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.405931 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerName="gather"
Apr 17 17:58:05.405979 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.405939 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb4abef8-ded0-4f91-8c1c-1afbc875ec40" containerName="copy"
Apr 17 17:58:05.412827 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.412803 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.416230 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.416210 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6j5hz\"/\"default-dockercfg-f7m6d\""
Apr 17 17:58:05.417272 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.417254 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6j5hz\"/\"openshift-service-ca.crt\""
Apr 17 17:58:05.417370 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.417355 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6j5hz\"/\"kube-root-ca.crt\""
Apr 17 17:58:05.426010 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.425990 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"]
Apr 17 17:58:05.489682 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.489650 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-proc\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.489886 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.489688 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-lib-modules\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.489886 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.489788 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-podres\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.489886 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.489865 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbd9s\" (UniqueName: \"kubernetes.io/projected/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-kube-api-access-nbd9s\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.490021 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.489900 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-sys\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.499372 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.499344 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/1.log"
Apr 17 17:58:05.506217 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.506183 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sb45j_48466d02-4d96-4b89-b43e-1adc61971469/console-operator/2.log"
Apr 17 17:58:05.591118 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591087 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-podres\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591164 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbd9s\" (UniqueName: \"kubernetes.io/projected/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-kube-api-access-nbd9s\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591199 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-sys\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591274 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591255 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-proc\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591269 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-podres\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591289 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-lib-modules\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591332 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-proc\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591439 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591362 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-sys\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.591566 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.591438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-lib-modules\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.599770 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.599742 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbd9s\" (UniqueName: \"kubernetes.io/projected/ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b-kube-api-access-nbd9s\") pod \"perf-node-gather-daemonset-vgb9l\" (UID: \"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.722117 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.722040 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:05.848053 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.848027 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"]
Apr 17 17:58:05.849132 ip-10-0-143-59 kubenswrapper[2565]: W0417 17:58:05.849106 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podffdd8dd5_f6a2_40a8_98d2_3f693b7b769b.slice/crio-ddbd613df4bfb8d91f26b046c8be80c6093a7fe4221d111e9d7a27f177cc72d9 WatchSource:0}: Error finding container ddbd613df4bfb8d91f26b046c8be80c6093a7fe4221d111e9d7a27f177cc72d9: Status 404 returned error can't find the container with id ddbd613df4bfb8d91f26b046c8be80c6093a7fe4221d111e9d7a27f177cc72d9
Apr 17 17:58:05.995912 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:05.995827 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-lnv76_66303fe9-2e35-470e-8d83-6c31c7481520/download-server/0.log"
Apr 17 17:58:06.410090 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:06.410059 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l" event={"ID":"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b","Type":"ContainerStarted","Data":"6ac1815e07e794273fa28456351e4aba6a4c404086b566600b2f467fdd737a41"}
Apr 17 17:58:06.410276 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:06.410099 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l"
Apr 17 17:58:06.410276 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:06.410110 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l" event={"ID":"ffdd8dd5-f6a2-40a8-98d2-3f693b7b769b","Type":"ContainerStarted","Data":"ddbd613df4bfb8d91f26b046c8be80c6093a7fe4221d111e9d7a27f177cc72d9"}
Apr 17 17:58:06.431742 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:06.431694 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l" podStartSLOduration=1.431680159 podStartE2EDuration="1.431680159s" podCreationTimestamp="2026-04-17 17:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:58:06.428258031 +0000 UTC m=+2035.760875677" watchObservedRunningTime="2026-04-17 17:58:06.431680159 +0000 UTC m=+2035.764297805"
Apr 17 17:58:06.456524 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:06.456499 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-ct24q_19432bca-a912-4225-ab78-b273f482ec76/volume-data-source-validator/0.log"
Apr 17 17:58:07.470763 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:07.470731 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zlw54_40520684-848d-46b0-8288-7c708075eda2/dns/0.log"
Apr 17 17:58:07.496293 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:07.496266 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zlw54_40520684-848d-46b0-8288-7c708075eda2/kube-rbac-proxy/0.log"
Apr 17 17:58:07.555570 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:07.555542 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f54jc_1ac6ac62-b297-4aad-a58f-12c981987869/dns-node-resolver/0.log"
Apr 17 17:58:08.144988 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:08.144957 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rkk2d_a04f786b-b1fc-4078-9d22-1b263b785992/node-ca/0.log"
Apr 17 17:58:09.172448 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:09.172414 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-564dfc4b58-br5kx_fba1d80e-239c-4b35-afc0-5052c340a9ee/router/0.log" Apr 17 17:58:09.686526 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:09.686491 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r4jgw_8cd66734-7425-4cd0-99cb-727b3059a1ed/serve-healthcheck-canary/0.log" Apr 17 17:58:10.185451 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:10.185418 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mh4jk_f175c7b4-7e01-4c45-b098-27c87d4ba139/insights-operator/0.log" Apr 17 17:58:10.185916 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:10.185800 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mh4jk_f175c7b4-7e01-4c45-b098-27c87d4ba139/insights-operator/1.log" Apr 17 17:58:10.298777 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:10.298753 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qhbg6_e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523/kube-rbac-proxy/0.log" Apr 17 17:58:10.328928 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:10.328901 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qhbg6_e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523/exporter/0.log" Apr 17 17:58:10.361365 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:10.361338 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qhbg6_e04dca9e-5b67-4a1c-9c5a-5a5cd5fad523/extractor/0.log" Apr 17 17:58:12.424315 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:12.424284 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-vgb9l" Apr 17 17:58:13.167654 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:13.167622 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6dd684f56d-x5nwj_34cb13f4-0a7d-4a07-9b4c-2e858b17357c/manager/0.log" Apr 17 17:58:13.878513 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:13.878480 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-7674b4456b-9l4k4_4d80993d-a624-4870-bd9e-9b65f601f330/manager/0.log" Apr 17 17:58:14.310280 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:14.310245 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-24m8k_49d94812-165f-4f16-8f9e-0fa53631f6c4/s3-init/0.log" Apr 17 17:58:14.345792 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:14.345763 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-2m8vh_dbed05be-4c57-475a-94bf-b997866f6146/seaweedfs/0.log" Apr 17 17:58:20.089203 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:20.089164 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gq4sp_30189955-9e22-4280-bbeb-b99c4bea9d98/kube-storage-version-migrator-operator/0.log" Apr 17 17:58:20.097190 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:20.097165 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gq4sp_30189955-9e22-4280-bbeb-b99c4bea9d98/kube-storage-version-migrator-operator/1.log" Apr 17 17:58:21.329853 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:21.329811 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hc4g_22fe4e50-13fd-4ae5-b9a6-1552184b400f/kube-multus-additional-cni-plugins/0.log" Apr 17 17:58:21.355164 
ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:21.355140 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hc4g_22fe4e50-13fd-4ae5-b9a6-1552184b400f/egress-router-binary-copy/0.log" Apr 17 17:58:21.384435 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:21.384410 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hc4g_22fe4e50-13fd-4ae5-b9a6-1552184b400f/cni-plugins/0.log" Apr 17 17:58:21.421433 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:21.421398 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hc4g_22fe4e50-13fd-4ae5-b9a6-1552184b400f/bond-cni-plugin/0.log" Apr 17 17:58:21.455319 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:21.455293 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hc4g_22fe4e50-13fd-4ae5-b9a6-1552184b400f/routeoverride-cni/0.log" Apr 17 17:58:21.493389 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:21.493361 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hc4g_22fe4e50-13fd-4ae5-b9a6-1552184b400f/whereabouts-cni-bincopy/0.log" Apr 17 17:58:21.523187 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:21.523161 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2hc4g_22fe4e50-13fd-4ae5-b9a6-1552184b400f/whereabouts-cni/0.log" Apr 17 17:58:22.021988 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:22.021904 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t2kcl_7c713c7a-430f-48a4-9274-ca5da277991e/kube-multus/0.log" Apr 17 17:58:22.091420 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:22.091377 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-clsml_053402a3-0f05-4423-9697-95ba118cec9c/network-metrics-daemon/0.log" Apr 17 17:58:22.122230 ip-10-0-143-59 kubenswrapper[2565]: I0417 17:58:22.122207 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-clsml_053402a3-0f05-4423-9697-95ba118cec9c/kube-rbac-proxy/0.log"