Apr 24 21:14:10.655934 ip-10-0-141-46 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:14:10.655946 ip-10-0-141-46 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:14:10.655953 ip-10-0-141-46 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:14:10.656166 ip-10-0-141-46 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:14:20.685708 ip-10-0-141-46 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:14:20.685727 ip-10-0-141-46 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4ba4859f4e7e4edfb2f6e3cd92baace7 --
Apr 24 21:16:41.269407 ip-10-0-141-46 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:16:41.753315 ip-10-0-141-46 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:41.753315 ip-10-0-141-46 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:16:41.753315 ip-10-0-141-46 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:41.753315 ip-10-0-141-46 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:16:41.753315 ip-10-0-141-46 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:16:41.757173 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.757076    2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:16:41.759604 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759587    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:41.759604 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759604    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759608    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759611    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759614    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759617    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759630    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759634    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759637    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759639    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759642    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759645    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759647    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759650    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759653    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759655    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759658    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759661    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759663    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759666    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759668    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:41.759708 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759671    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759673    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759676    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759678    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759681    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759684    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759686    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759690    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759694    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759697    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759699    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759702    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759705    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759708    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759711    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759714    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759717    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759720    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759731    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:41.760207 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759734    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759737    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759739    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759742    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759746    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759749    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759751    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759754    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759756    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759759    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759761    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759764    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759766    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759769    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759772    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759774    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759777    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759779    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759782    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759784    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:41.760913 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759787    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759789    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759792    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759794    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759796    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759799    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759801    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759805    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759808    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759810    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759813    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759815    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759823    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759826    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759829    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759831    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759834    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759837    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759839    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759841    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:41.761448 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759844    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:41.762182 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759846    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:41.762182 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759858    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:41.762182 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759861    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:41.762182 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759864    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:41.762182 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.759867    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762227    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762245    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762252    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762258    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762264    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762269    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762274    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762278    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762282    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762287    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762291    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762295    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762299    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762304    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762308    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762312    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762316    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762320    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762324    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:41.762406 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762329    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762333    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762337    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762341    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762345    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762350    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762354    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762359    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762363    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762367    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762371    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762375    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762379    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762384    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762389    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762393    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762398    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762402    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762407    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762411    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:41.763287 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762414    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762419    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762424    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762428    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762435    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762443    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762448    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762453    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762457    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762462    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762467    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762471    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762476    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762480    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762486    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762490    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762494    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762498    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762502    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:41.764238 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762506    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762511    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762515    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762518    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762522    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762527    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762531    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762536    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762540    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762545    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762549    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762553    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762557    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762561    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762566    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762570    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762574    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762578    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762584    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762588    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:41.764748 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762594    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762598    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762602    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762607    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762610    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762615    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762619    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.762623    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762733    2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762744    2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762756    2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762763    2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762770    2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762775    2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762782    2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762789    2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762795    2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762800    2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762806    2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762816    2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762821    2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762826    2575 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:16:41.765353 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762831    2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762836    2575 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762841    2575 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762845    2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762850    2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762857    2575 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762861    2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762866    2575 flags.go:64] FLAG: --config-dir=""
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762871    2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762876    2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762883    2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762889    2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762894    2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762899    2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762904    2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762909    2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762913    2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762918    2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762923    2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762930    2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762935    2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762940    2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 
24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762945 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762950 2575 flags.go:64] FLAG: --enable-server="true" Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762954 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 21:16:41.766081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762962 2575 flags.go:64] FLAG: --event-burst="100" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762968 2575 flags.go:64] FLAG: --event-qps="50" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762973 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762978 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762985 2575 flags.go:64] FLAG: --eviction-hard="" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762992 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.762996 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763002 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763007 2575 flags.go:64] FLAG: --eviction-soft="" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763012 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763016 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763021 2575 flags.go:64] FLAG: 
--experimental-allocatable-ignore-eviction="false" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763026 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763031 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763036 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763041 2575 flags.go:64] FLAG: --feature-gates="" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763047 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763052 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763057 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763103 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763110 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763116 2575 flags.go:64] FLAG: --help="false" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763121 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-141-46.ec2.internal" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763126 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:16:41.766740 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763132 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763137 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 
24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763143 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763149 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763154 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763158 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763163 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763168 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763173 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763178 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763183 2575 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763189 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763195 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763200 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763204 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763210 2575 flags.go:64] FLAG: --lock-file="" Apr 24 21:16:41.767436 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763215 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763220 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763225 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763237 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763242 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763246 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763251 2575 flags.go:64] FLAG: --logging-format="text" Apr 24 21:16:41.767436 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763256 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763262 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763266 2575 flags.go:64] FLAG: --manifest-url="" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763271 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763278 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763284 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763290 2575 flags.go:64] FLAG: --max-pods="110" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763295 2575 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763299 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763305 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763309 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763314 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763319 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763323 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763342 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763347 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763353 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763357 2575 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763362 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763372 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763377 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: 
I0424 21:16:41.763385 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763389 2575 flags.go:64] FLAG: --port="10250" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763394 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:16:41.768000 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763399 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01820c531203892e2" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763404 2575 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763410 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763415 2575 flags.go:64] FLAG: --register-node="true" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763420 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763425 2575 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763431 2575 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763436 2575 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763440 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763445 2575 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763452 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763457 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763462 2575 
flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763467 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763472 2575 flags.go:64] FLAG: --runonce="false" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763477 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763482 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763487 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763492 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763497 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763502 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763507 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763512 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763517 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763521 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763526 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:16:41.768624 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763531 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 
21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763537 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763542 2575 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763551 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763560 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763565 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763569 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763576 2575 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763581 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763586 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763591 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763598 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763603 2575 flags.go:64] FLAG: --v="2" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763609 2575 flags.go:64] FLAG: --version="false" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763616 2575 flags.go:64] FLAG: --vmodule="" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763623 2575 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.763629 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763794 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763801 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763806 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763811 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763816 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763821 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763826 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:16:41.769267 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763830 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763834 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763839 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763843 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763848 2575 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763853 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763857 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763861 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763865 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763869 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763874 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763880 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763885 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763889 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763894 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763898 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763903 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 
24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763907 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763911 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:16:41.769868 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763915 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763919 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763923 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763928 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763933 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763939 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763945 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763950 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763955 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763959 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763965 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763970 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763975 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763980 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763985 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.763990 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764018 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764026 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764032 2575 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiNetworks Apr 24 21:16:41.770445 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764036 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764041 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764047 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764052 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764056 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764081 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764086 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764091 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764095 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764099 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764103 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764107 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764111 2575 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764115 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764119 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764123 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764128 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764132 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764136 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764140 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:16:41.770928 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764145 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764149 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764153 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764156 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764161 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:16:41.771444 ip-10-0-141-46 
kubenswrapper[2575]: W0424 21:16:41.764167 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764171 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764175 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764181 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764187 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764192 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764196 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764200 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764204 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764208 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764213 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764217 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764223 2575 feature_gate.go:328] 
unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764228 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:41.771444 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764232 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:41.771900 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.764236 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:41.771900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.764978 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:41.772632 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.772609 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:16:41.772671 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.772633 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:16:41.772702 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772692 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:41.772702 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772698 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:41.772702 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772702 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772706 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772710 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772713 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772716 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772719 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772722 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772725 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772727 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772730 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772733 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772736 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772738 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772741 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772744 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772746 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772749 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772752 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772754 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:41.772785 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772757 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772760 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772762 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772765 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772768 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772770 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772773 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772776 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772778 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772781 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772784 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772787 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772790 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772793 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772796 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772799 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772802 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772804 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772807 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772810 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:41.773288 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772812 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772815 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772817 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772820 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772822 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772825 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772829 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772833 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772835 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772838 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772840 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772843 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772846 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772848 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772851 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772854 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772857 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772859 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772862 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:41.773769 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772865 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772867 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772870 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772872 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772875 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772878 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772880 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772883 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772885 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772888 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772890 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772893 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772895 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772898 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772900 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772903 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772906 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772908 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772911 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772915 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:41.774290 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772919 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772922 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772924 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772928 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772932 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.772934 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.772940 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773055 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773059 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773079 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773083 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773086 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773089 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773092 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773094 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773097 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:16:41.774780 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773100 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773103 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773106 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773108 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773111 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773114 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773116 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773119 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773122 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773124 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773127 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773130 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773132 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773134 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773137 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773140 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773142 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773145 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773147 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773149 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:16:41.775266 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773153 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773155 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773159 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773164 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773167 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773170 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773172 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773175 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773178 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773180 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773183 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773186 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773188 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773191 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773194 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773196 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773199 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773201 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773204 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:16:41.775741 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773206 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773209 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773212 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773214 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773217 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773219 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773222 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773225 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773227 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773230 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773232 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773235 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773238 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773241 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773243 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773246 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773249 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773252 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773254 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773257 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:16:41.776222 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773260 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773262 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773264 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773267 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773270 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773272 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773275 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773277 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773281 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773283 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773285 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773288 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773291 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773293 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773296 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773299 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773303 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:16:41.776725 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:41.773306 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:16:41.777229 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.773312 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:16:41.777229 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.774180 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:16:41.777603 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.777587 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:16:41.778427 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.778414 2575 server.go:1019] "Starting client certificate rotation"
Apr 24 21:16:41.778531 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.778514 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:41.778561 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.778552 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:16:41.805789 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.805762 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:41.810521 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.810502 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:16:41.820564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.820385 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:16:41.826476 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.826451 2575 log.go:25] "Validated CRI v1 image API"
Apr 24 21:16:41.829835 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.829810 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:16:41.833009 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.832986 2575 fs.go:135] Filesystem UUIDs: map[1930513e-7853-4067-a682-e1cd7e9dcc62:/dev/nvme0n1p3 4941c64f-87e9-4f94-a7b1-c1de2b36443a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 24 21:16:41.833097 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.833008 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:16:41.837779 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.837760 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:41.839105 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.838986 2575 manager.go:217] Machine: {Timestamp:2026-04-24 21:16:41.837137272 +0000 UTC m=+0.438292823 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097750 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28b2f93a0c67ec5069054c2ee1c891 SystemUUID:ec28b2f9-3a0c-67ec-5069-054c2ee1c891 BootID:4ba4859f-4e7e-4edf-b2f6-e3cd92baace7 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:18:43:86:cf:e7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:18:43:86:cf:e7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:fd:2b:04:55:c5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:16:41.839105 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.839102 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:16:41.839207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.839197 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:16:41.840040 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.840012 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:16:41.840194 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.840040 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-46.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:16:41.840244 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.840200 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:16:41.840244 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.840209 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:16:41.840244 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.840223 2575
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:16:41.840244 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.840237 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:16:41.841483 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.841472 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:16:41.841602 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.841593 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:16:41.844145 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.844134 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:16:41.844192 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.844148 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:16:41.844192 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.844164 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:16:41.844192 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.844174 2575 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:16:41.844192 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.844182 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:16:41.845317 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.845301 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:16:41.845317 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.845320 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:16:41.848982 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.848962 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:16:41.850372 ip-10-0-141-46 
kubenswrapper[2575]: I0424 21:16:41.850358 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:16:41.852002 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.851987 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852006 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852013 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852019 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852025 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852031 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852036 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852042 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852049 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:16:41.852060 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852056 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:16:41.852309 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852079 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:16:41.852309 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852089 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:16:41.852978 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852964 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:16:41.853012 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.852979 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:16:41.855908 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.855888 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-46.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:16:41.856021 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.855956 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-46.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:16:41.856088 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.856054 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:16:41.856940 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.856928 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:16:41.856980 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.856968 2575 server.go:1295] "Started kubelet" Apr 24 21:16:41.857048 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.857026 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 
21:16:41.857150 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.857101 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:16:41.857192 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.857169 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:16:41.857562 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.857542 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tkgsx" Apr 24 21:16:41.857925 ip-10-0-141-46 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:16:41.858627 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.858603 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:16:41.858981 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.858958 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:16:41.862806 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.862780 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tkgsx" Apr 24 21:16:41.868091 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.865435 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-46.ec2.internal.18a967915451d744 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-46.ec2.internal,UID:ip-10-0-141-46.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-46.ec2.internal,},FirstTimestamp:2026-04-24 21:16:41.856939844 +0000 UTC m=+0.458095395,LastTimestamp:2026-04-24 21:16:41.856939844 +0000 UTC 
m=+0.458095395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-46.ec2.internal,}" Apr 24 21:16:41.868516 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.868501 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:16:41.868562 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.868521 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:16:41.869127 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869102 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:16:41.869127 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869102 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:16:41.869264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869136 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:16:41.869264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869233 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:16:41.869264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869242 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:16:41.869409 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.869389 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 24 21:16:41.869645 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.869622 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:16:41.869730 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869722 2575 factory.go:55] Registering systemd factory Apr 24 21:16:41.869784 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869765 2575 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:16:41.869973 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869958 2575 factory.go:153] Registering CRI-O factory Apr 24 21:16:41.870039 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.869976 2575 factory.go:223] Registration of the crio container factory successfully Apr 24 21:16:41.870039 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.870031 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:16:41.870159 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.870053 2575 factory.go:103] Registering Raw factory Apr 24 21:16:41.870159 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.870080 2575 manager.go:1196] Started watching for new ooms in manager Apr 24 21:16:41.870467 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.870456 2575 manager.go:319] Starting recovery of all containers Apr 24 21:16:41.876777 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.876734 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 21:16:41.879649 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.879625 2575 manager.go:324] Recovery completed Apr 24 21:16:41.880674 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.880647 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:41.883146 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.883112 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-46.ec2.internal\" not found" node="ip-10-0-141-46.ec2.internal" Apr 24 21:16:41.884683 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.884671 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:41.889282 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.889258 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:41.889360 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.889289 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:41.889360 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.889301 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:41.889831 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.889818 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:16:41.889831 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.889829 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:16:41.889899 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.889847 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:16:41.891988 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.891976 2575 policy_none.go:49] "None policy: Start" Apr 24 21:16:41.892039 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.891992 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:16:41.892337 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.892328 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:16:41.924735 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.924717 2575 manager.go:341] "Starting Device Plugin manager" Apr 24 21:16:41.924852 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.924793 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:16:41.924852 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.924807 2575 server.go:85] "Starting device plugin registration server" Apr 24 21:16:41.925115 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.925103 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:16:41.925179 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.925120 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:16:41.925271 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.925251 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:16:41.925347 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.925335 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:16:41.925347 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:41.925347 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:16:41.928049 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.927566 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 21:16:41.928049 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:41.927608 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-46.ec2.internal\" not found" Apr 24 21:16:42.000944 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.000915 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:16:42.000944 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.000951 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:16:42.001188 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.000971 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:16:42.001188 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.000978 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:16:42.001188 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.001020 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:16:42.004271 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.004219 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:42.026220 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.026194 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:42.028190 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.028173 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:42.028303 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.028207 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:42.028303 ip-10-0-141-46 
kubenswrapper[2575]: I0424 21:16:42.028221 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:42.028303 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.028252 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.035756 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.035739 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.035867 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.035767 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-46.ec2.internal\": node \"ip-10-0-141-46.ec2.internal\" not found" Apr 24 21:16:42.048735 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.048711 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 24 21:16:42.101311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.101247 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal"] Apr 24 21:16:42.101438 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.101365 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:42.102985 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.102967 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:42.103084 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.103001 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:42.103084 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.103010 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:42.104861 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.104847 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:42.104983 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.104969 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.105033 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.105003 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:42.106237 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.106219 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:42.106343 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.106248 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:42.106343 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.106223 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:42.106343 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.106279 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:42.106343 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.106291 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:42.106343 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.106259 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" 
event="NodeHasSufficientPID" Apr 24 21:16:42.107346 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.107327 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.107448 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.107356 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:16:42.108134 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.108117 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:16:42.108212 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.108147 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:16:42.108212 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.108163 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:16:42.124675 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.124654 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-46.ec2.internal\" not found" node="ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.128180 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.128163 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-46.ec2.internal\" not found" node="ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.149308 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.149282 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 24 21:16:42.170698 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.170668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.249690 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.249657 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 24 21:16:42.271103 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.270973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6551c028e13aff466c38397d8a508ac4-config\") pod \"kube-apiserver-proxy-ip-10-0-141-46.ec2.internal\" (UID: \"6551c028e13aff466c38397d8a508ac4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.271247 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.271105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.271247 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.271135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.271247 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.271194 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.350452 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.350413 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found" Apr 24 21:16:42.371759 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.371730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.371850 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.371782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6551c028e13aff466c38397d8a508ac4-config\") pod \"kube-apiserver-proxy-ip-10-0-141-46.ec2.internal\" (UID: \"6551c028e13aff466c38397d8a508ac4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.371850 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.371809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea9c91cb8023c6edf0f87546014505a5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal\" (UID: \"ea9c91cb8023c6edf0f87546014505a5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" Apr 24 21:16:42.371850 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.371827 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6551c028e13aff466c38397d8a508ac4-config\") pod \"kube-apiserver-proxy-ip-10-0-141-46.ec2.internal\" (UID: \"6551c028e13aff466c38397d8a508ac4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal"
Apr 24 21:16:42.426934 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.426906 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal"
Apr 24 21:16:42.430721 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.430702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal"
Apr 24 21:16:42.451124 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.451091 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:42.551661 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.551566 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:42.652077 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.652044 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:42.752585 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.752551 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:42.778038 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.778011 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:16:42.778677 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.778177 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:42.778677 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.778212 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:16:42.828704 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.828129 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:42.853550 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.853518 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:42.865437 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.865400 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:11:41 +0000 UTC" deadline="2027-10-21 03:22:46.736223169 +0000 UTC"
Apr 24 21:16:42.865437 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.865431 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13062h6m3.87079505s"
Apr 24 21:16:42.869316 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.869290 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:16:42.878656 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.878633 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:16:42.911437 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:42.911395 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea9c91cb8023c6edf0f87546014505a5.slice/crio-15a90ea8fd9b25e620b3d5f16943a505f0d48956f147500f11d2e8b5a4842d29 WatchSource:0}: Error finding container 15a90ea8fd9b25e620b3d5f16943a505f0d48956f147500f11d2e8b5a4842d29: Status 404 returned error can't find the container with id 15a90ea8fd9b25e620b3d5f16943a505f0d48956f147500f11d2e8b5a4842d29
Apr 24 21:16:42.911953 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:42.911925 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6551c028e13aff466c38397d8a508ac4.slice/crio-cb10f6c39c1e2ff2be1a6441b6101b3ac6f5c2497e0f8789280e51bef837aade WatchSource:0}: Error finding container cb10f6c39c1e2ff2be1a6441b6101b3ac6f5c2497e0f8789280e51bef837aade: Status 404 returned error can't find the container with id cb10f6c39c1e2ff2be1a6441b6101b3ac6f5c2497e0f8789280e51bef837aade
Apr 24 21:16:42.915568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.915551 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:16:42.954224 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:42.954185 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:42.975797 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.975770 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n2rnm"
Apr 24 21:16:42.982702 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:42.982685 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n2rnm"
Apr 24 21:16:43.004686 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.004639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" event={"ID":"6551c028e13aff466c38397d8a508ac4","Type":"ContainerStarted","Data":"cb10f6c39c1e2ff2be1a6441b6101b3ac6f5c2497e0f8789280e51bef837aade"}
Apr 24 21:16:43.005639 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.005620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" event={"ID":"ea9c91cb8023c6edf0f87546014505a5","Type":"ContainerStarted","Data":"15a90ea8fd9b25e620b3d5f16943a505f0d48956f147500f11d2e8b5a4842d29"}
Apr 24 21:16:43.054851 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.054815 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:43.155449 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.155368 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:43.255910 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.255876 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:43.356652 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.356608 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-46.ec2.internal\" not found"
Apr 24 21:16:43.449114 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.448975 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:16:43.469626 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.469592 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal"
Apr 24 21:16:43.478429 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.478303 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:43.479626 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.479356 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal"
Apr 24 21:16:43.488918 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.488794 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:16:43.845659 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.845574 2575 apiserver.go:52] "Watching apiserver"
Apr 24 21:16:43.852651 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.852626 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:16:43.853020 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.852993 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal","openshift-multus/multus-additional-cni-plugins-r8829","openshift-multus/multus-vlds8","openshift-multus/network-metrics-daemon-qm8hh","openshift-network-diagnostics/network-check-target-qfs2f","openshift-network-operator/iptables-alerter-8v98v","openshift-ovn-kubernetes/ovnkube-node-g9z5w","kube-system/konnectivity-agent-5r6d9","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4","openshift-cluster-node-tuning-operator/tuned-lhl59","openshift-image-registry/node-ca-d7mjl","kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal"]
Apr 24 21:16:43.855130 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.855105 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r8829"
Apr 24 21:16:43.858541 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858307 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.858541 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858374 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-czgd4\""
Apr 24 21:16:43.858541 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858479 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:16:43.858541 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:43.858785 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.858562 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:43.858839 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858815 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:16:43.858943 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858891 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:16:43.858943 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858909 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:16:43.859057 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.858942 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:16:43.860269 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.860247 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:43.860368 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.860319 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:43.860453 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.860435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-t525h\""
Apr 24 21:16:43.860578 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.860557 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:16:43.862342 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.862017 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8v98v"
Apr 24 21:16:43.864660 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.864562 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:16:43.866384 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.866037 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:43.866384 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.866053 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:43.866384 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.866101 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.866384 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.866237 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fzlbr\""
Apr 24 21:16:43.868367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.868348 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:16:43.868473 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.868381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5r6d9"
Apr 24 21:16:43.868473 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.868398 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:16:43.868587 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.868503 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qcgmz\""
Apr 24 21:16:43.869494 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.869402 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:16:43.869494 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.869416 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:16:43.869494 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.869495 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:16:43.869703 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.869549 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:16:43.870682 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.870659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:16:43.870781 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.870716 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:16:43.870965 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.870951 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j85kl\""
Apr 24 21:16:43.871055 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.870955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4"
Apr 24 21:16:43.872946 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.872927 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lhl59"
Apr 24 21:16:43.873089 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.873057 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:16:43.873456 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.873352 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:16:43.873572 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.873552 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:16:43.873829 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.873756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-ft8qp\""
Apr 24 21:16:43.875033 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.875015 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d7mjl"
Apr 24 21:16:43.876756 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.876739 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:16:43.876756 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.876757 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lz4gl\""
Apr 24 21:16:43.876893 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.876774 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:16:43.879608 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78dt\" (UniqueName: \"kubernetes.io/projected/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-kube-api-access-f78dt\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.879691 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-ovn\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.879691 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-socket-dir-parent\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.879770 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-cni-multus\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.879770 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879738 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-etc-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.879863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-device-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4"
Apr 24 21:16:43.879863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-system-cni-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.879959 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-kubelet\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.879959 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-etc-kubernetes\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.879959 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjgmw\" (UniqueName: \"kubernetes.io/projected/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-kube-api-access-mjgmw\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:43.879959 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-cni-bin\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880177 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.879989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-cni-netd\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880177 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3b4de04-a724-4231-a103-ae88c77beb64-ovn-node-metrics-cert\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880177 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-cni-binary-copy\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.880177 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-daemon-config\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.880177 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:43.880177 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-slash\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880177 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-log-socket\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-os-release\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-kubelet\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm62h\" (UniqueName: \"kubernetes.io/projected/e3b4de04-a724-4231-a103-ae88c77beb64-kube-api-access-xm62h\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880241 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7glp\" (UniqueName: \"kubernetes.io/projected/8cfc495b-d79a-4bed-94c7-6101136f82e7-kube-api-access-l7glp\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-cnibin\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-k8s-cni-cncf-io\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-multus-certs\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880401 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c230c44d-8923-409d-a9e3-443872457536-iptables-alerter-script\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-ovnkube-config\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cnibin\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-os-release\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/af0483bc-341e-4d8b-a4d1-0e5fb17d43a8-konnectivity-ca\") pod \"konnectivity-agent-5r6d9\" (UID: \"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8\") " pod="kube-system/konnectivity-agent-5r6d9"
Apr 24 21:16:43.880560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-systemd-units\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880585 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-hostroot\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-env-overrides\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cni-binary-copy\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-netns\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcpdq\" (UniqueName: \"kubernetes.io/projected/c230c44d-8923-409d-a9e3-443872457536-kube-api-access-zcpdq\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-registration-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-conf-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c230c44d-8923-409d-a9e3-443872457536-host-slash\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-ovnkube-script-lib\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-etc-selinux\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4"
Apr 24 21:16:43.881311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829"
Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.880992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-cni-bin\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8"
Apr 24 21:16:43.882030
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-run-netns\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-systemd\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-socket-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-system-cni-dir\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwrt6\" (UniqueName: 
\"kubernetes.io/projected/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-kube-api-access-dwrt6\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-cni-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-node-log\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881271 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-sys-fs\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881340 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/af0483bc-341e-4d8b-a4d1-0e5fb17d43a8-agent-certs\") pod \"konnectivity-agent-5r6d9\" (UID: \"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8\") " pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-var-lib-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.882030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.881912 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:16:43.882647 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.882302 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:16:43.882647 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.882326 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jrbtm\"" Apr 24 21:16:43.882749 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.882660 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:16:43.891007 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.890989 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:43.969654 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.969629 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired 
state of world" Apr 24 21:16:43.981577 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.981577 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-ovnkube-script-lib\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.981843 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-etc-selinux\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.981843 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-run\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.981843 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.981843 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.981843 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-cni-bin\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.981843 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-etc-selinux\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-cni-bin\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981747 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-run-netns\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-systemd\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-socket-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981792 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-run-netns\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.981814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:16:43.982034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-systemd\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-serviceca\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-system-cni-dir\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.982122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-socket-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwrt6\" (UniqueName: \"kubernetes.io/projected/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-kube-api-access-dwrt6\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " 
pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-system-cni-dir\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-cni-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-cni-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-node-log\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-sys-fs\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-host\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysconfig\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-node-log\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-ovnkube-script-lib\") pod \"ovnkube-node-g9z5w\" (UID: 
\"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-sys-fs\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.982423 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/af0483bc-341e-4d8b-a4d1-0e5fb17d43a8-agent-certs\") pod \"konnectivity-agent-5r6d9\" (UID: \"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8\") " pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.982499 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs podName:f3b8d86d-c179-4368-b025-0c7f41b2aa3e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:44.48246625 +0000 UTC m=+3.083621790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs") pod "network-metrics-daemon-qm8hh" (UID: "f3b8d86d-c179-4368-b025-0c7f41b2aa3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-var-lib-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.982564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-host\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f78dt\" (UniqueName: \"kubernetes.io/projected/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-kube-api-access-f78dt\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-var-lib-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982596 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-ovn\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a470a779-5d88-4004-a798-9ea3cbcbfe6d-tmp\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-ovn\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-socket-dir-parent\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-cni-multus\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982695 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-socket-dir-parent\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-etc-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-device-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-cni-multus\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-systemd\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982784 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-lib-modules\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-etc-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-system-cni-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-device-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982827 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:16:43.983370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-kubelet\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-var-lib-kubelet\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-system-cni-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-etc-kubernetes\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjgmw\" (UniqueName: \"kubernetes.io/projected/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-kube-api-access-mjgmw\") pod \"network-metrics-daemon-qm8hh\" (UID: 
\"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-cni-bin\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-etc-kubernetes\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-cni-netd\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.982997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-cni-bin\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3b4de04-a724-4231-a103-ae88c77beb64-ovn-node-metrics-cert\") pod \"ovnkube-node-g9z5w\" (UID: 
\"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-cni-netd\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-modprobe-d\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-cni-binary-copy\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-sys\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-tuned\") pod \"tuned-lhl59\" (UID: 
\"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-daemon-config\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-slash\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.984207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-log-socket\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysctl-conf\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-os-release\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-kubelet\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm62h\" (UniqueName: \"kubernetes.io/projected/e3b4de04-a724-4231-a103-ae88c77beb64-kube-api-access-xm62h\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7glp\" (UniqueName: \"kubernetes.io/projected/8cfc495b-d79a-4bed-94c7-6101136f82e7-kube-api-access-l7glp\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983453 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7f6\" (UniqueName: \"kubernetes.io/projected/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-kube-api-access-6t7f6\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysctl-d\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983530 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-cnibin\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-k8s-cni-cncf-io\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: 
I0424 21:16:43.983577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-multus-certs\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c230c44d-8923-409d-a9e3-443872457536-iptables-alerter-script\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-ovnkube-config\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-var-lib-kubelet\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-cni-binary-copy\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985051 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cnibin\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.985051 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-os-release\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/af0483bc-341e-4d8b-a4d1-0e5fb17d43a8-konnectivity-ca\") pod \"konnectivity-agent-5r6d9\" (UID: \"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8\") " pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983739 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:42 +0000 UTC" deadline="2027-12-22 13:15:15.845880453 +0000 UTC" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983756 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14559h58m31.862127345s" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-systemd-units\") pod \"ovnkube-node-g9z5w\" (UID: 
\"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-hostroot\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-env-overrides\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-daemon-config\") pod \"multus-vlds8\" (UID: 
\"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.983972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cni-binary-copy\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-systemd-units\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cnibin\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-os-release\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-hostroot\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " 
pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-os-release\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-run-openvswitch\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-netns\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.985755 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcpdq\" (UniqueName: \"kubernetes.io/projected/c230c44d-8923-409d-a9e3-443872457536-kube-api-access-zcpdq\") pod \"iptables-alerter-8v98v\" (UID: 
\"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-slash\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-registration-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-env-overrides\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-kubernetes\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-host-kubelet\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-registration-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-cnibin\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c230c44d-8923-409d-a9e3-443872457536-iptables-alerter-script\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-k8s-cni-cncf-io\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcb6r\" (UniqueName: \"kubernetes.io/projected/a470a779-5d88-4004-a798-9ea3cbcbfe6d-kube-api-access-lcb6r\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-multus-certs\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-conf-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984734 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c230c44d-8923-409d-a9e3-443872457536-host-slash\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v" Apr 24 21:16:43.986568 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cfc495b-d79a-4bed-94c7-6101136f82e7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-host-run-netns\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-multus-conf-dir\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984867 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c230c44d-8923-409d-a9e3-443872457536-host-slash\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.984895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3b4de04-a724-4231-a103-ae88c77beb64-log-socket\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.985138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3b4de04-a724-4231-a103-ae88c77beb64-ovnkube-config\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.985381 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-cni-binary-copy\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.985382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:43.987418 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.985646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/af0483bc-341e-4d8b-a4d1-0e5fb17d43a8-konnectivity-ca\") pod \"konnectivity-agent-5r6d9\" (UID: \"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8\") " pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.986254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3b4de04-a724-4231-a103-ae88c77beb64-ovn-node-metrics-cert\") pod \"ovnkube-node-g9z5w\" (UID: \"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:43.987418 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.986336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/af0483bc-341e-4d8b-a4d1-0e5fb17d43a8-agent-certs\") pod \"konnectivity-agent-5r6d9\" (UID: \"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8\") " pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:16:43.993679 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.993652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f78dt\" (UniqueName: \"kubernetes.io/projected/2d51ef4d-9ece-4f73-b21d-de62e4a3b68e-kube-api-access-f78dt\") pod \"multus-vlds8\" (UID: \"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e\") " pod="openshift-multus/multus-vlds8" Apr 24 21:16:43.994603 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.994554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwrt6\" (UniqueName: \"kubernetes.io/projected/5b89c06b-ff11-4cc0-bd26-7f792a0f1702-kube-api-access-dwrt6\") pod \"multus-additional-cni-plugins-r8829\" (UID: \"5b89c06b-ff11-4cc0-bd26-7f792a0f1702\") " pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 
21:16:43.995014 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.994984 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:43.995014 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.995008 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:43.995177 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.995023 2575 projected.go:194] Error preparing data for projected volume kube-api-access-b2sr2 for pod openshift-network-diagnostics/network-check-target-qfs2f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:43.995177 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:43.995097 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2 podName:e42a906f-a474-4d11-8a0d-e8ef290b2e14 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:44.495082641 +0000 UTC m=+3.096238190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b2sr2" (UniqueName: "kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2") pod "network-check-target-qfs2f" (UID: "e42a906f-a474-4d11-8a0d-e8ef290b2e14") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:43.995285 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.995243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcpdq\" (UniqueName: \"kubernetes.io/projected/c230c44d-8923-409d-a9e3-443872457536-kube-api-access-zcpdq\") pod \"iptables-alerter-8v98v\" (UID: \"c230c44d-8923-409d-a9e3-443872457536\") " pod="openshift-network-operator/iptables-alerter-8v98v" Apr 24 21:16:43.995734 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.995624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7glp\" (UniqueName: \"kubernetes.io/projected/8cfc495b-d79a-4bed-94c7-6101136f82e7-kube-api-access-l7glp\") pod \"aws-ebs-csi-driver-node-zqst4\" (UID: \"8cfc495b-d79a-4bed-94c7-6101136f82e7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:43.996137 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.996089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjgmw\" (UniqueName: \"kubernetes.io/projected/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-kube-api-access-mjgmw\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:16:43.996774 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:43.996751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm62h\" (UniqueName: \"kubernetes.io/projected/e3b4de04-a724-4231-a103-ae88c77beb64-kube-api-access-xm62h\") pod \"ovnkube-node-g9z5w\" (UID: 
\"e3b4de04-a724-4231-a103-ae88c77beb64\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:44.085966 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.085930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a470a779-5d88-4004-a798-9ea3cbcbfe6d-tmp\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.085983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-systemd\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-lib-modules\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-modprobe-d\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-sys\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-systemd\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-tuned\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086362 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysctl-conf\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086362 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-lib-modules\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086362 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7f6\" (UniqueName: \"kubernetes.io/projected/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-kube-api-access-6t7f6\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " 
pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:44.086362 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086260 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-modprobe-d\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086362 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysctl-conf\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086362 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-sys\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086595 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysctl-d\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086595 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-var-lib-kubelet\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086595 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-kubernetes\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086595 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcb6r\" (UniqueName: \"kubernetes.io/projected/a470a779-5d88-4004-a798-9ea3cbcbfe6d-kube-api-access-lcb6r\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086595 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-var-lib-kubelet\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-run\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086622 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-kubernetes\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-serviceca\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysctl-d\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-host\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-run\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysconfig\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-host\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-host\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-sysconfig\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.086840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.086802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a470a779-5d88-4004-a798-9ea3cbcbfe6d-host\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.087405 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.087047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-serviceca\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:44.088604 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.088581 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a470a779-5d88-4004-a798-9ea3cbcbfe6d-tmp\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.088737 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.088692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a470a779-5d88-4004-a798-9ea3cbcbfe6d-etc-tuned\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.094475 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.094452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7f6\" (UniqueName: \"kubernetes.io/projected/d887b4fc-65dd-41bb-9013-2e633d3ab7a8-kube-api-access-6t7f6\") pod \"node-ca-d7mjl\" (UID: \"d887b4fc-65dd-41bb-9013-2e633d3ab7a8\") " pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:44.094741 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.094720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcb6r\" (UniqueName: \"kubernetes.io/projected/a470a779-5d88-4004-a798-9ea3cbcbfe6d-kube-api-access-lcb6r\") pod \"tuned-lhl59\" (UID: \"a470a779-5d88-4004-a798-9ea3cbcbfe6d\") " pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.169175 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.167947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r8829" Apr 24 21:16:44.176723 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.176692 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlds8" Apr 24 21:16:44.186242 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.186209 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-8v98v" Apr 24 21:16:44.190871 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.190847 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:16:44.198442 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.198414 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:16:44.205032 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.205006 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" Apr 24 21:16:44.211732 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.211704 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lhl59" Apr 24 21:16:44.216344 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.216323 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d7mjl" Apr 24 21:16:44.292284 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.292255 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:16:44.490469 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.490403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:16:44.490601 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:44.490544 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:44.490637 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:44.490619 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs podName:f3b8d86d-c179-4368-b025-0c7f41b2aa3e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:45.49060344 +0000 UTC m=+4.091758982 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs") pod "network-metrics-daemon-qm8hh" (UID: "f3b8d86d-c179-4368-b025-0c7f41b2aa3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:44.526569 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.526540 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc230c44d_8923_409d_a9e3_443872457536.slice/crio-eafc903b5b2c3c7081ebc26af9ebb88f9cd5117da9aeff83f986c84d628aaf87 WatchSource:0}: Error finding container eafc903b5b2c3c7081ebc26af9ebb88f9cd5117da9aeff83f986c84d628aaf87: Status 404 returned error can't find the container with id eafc903b5b2c3c7081ebc26af9ebb88f9cd5117da9aeff83f986c84d628aaf87 Apr 24 21:16:44.527759 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.527707 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda470a779_5d88_4004_a798_9ea3cbcbfe6d.slice/crio-914f04c5ed62d7d68046dbe1d548f7cdb53ca1296fadfcabdd920efc0dae60d3 WatchSource:0}: Error finding container 914f04c5ed62d7d68046dbe1d548f7cdb53ca1296fadfcabdd920efc0dae60d3: Status 404 returned error can't find the container with id 914f04c5ed62d7d68046dbe1d548f7cdb53ca1296fadfcabdd920efc0dae60d3 Apr 24 21:16:44.528502 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.528405 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b4de04_a724_4231_a103_ae88c77beb64.slice/crio-3e20d53ff4716fff5fb4032771f90e21eb8716fe3ec02100c1ac275d11ef8850 WatchSource:0}: Error finding container 3e20d53ff4716fff5fb4032771f90e21eb8716fe3ec02100c1ac275d11ef8850: Status 404 returned error can't find the container with id 3e20d53ff4716fff5fb4032771f90e21eb8716fe3ec02100c1ac275d11ef8850 Apr 24 21:16:44.531866 
ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.531847 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cfc495b_d79a_4bed_94c7_6101136f82e7.slice/crio-81be13ac9052b13cbfebab0ce84aae6917d94b8fe752d1a7b4263ed9a1c8c2cf WatchSource:0}: Error finding container 81be13ac9052b13cbfebab0ce84aae6917d94b8fe752d1a7b4263ed9a1c8c2cf: Status 404 returned error can't find the container with id 81be13ac9052b13cbfebab0ce84aae6917d94b8fe752d1a7b4263ed9a1c8c2cf Apr 24 21:16:44.532731 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.532707 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b89c06b_ff11_4cc0_bd26_7f792a0f1702.slice/crio-8865a3adcd916781ce9d516b202297e31f976616a1b6672a48846635f0b0996d WatchSource:0}: Error finding container 8865a3adcd916781ce9d516b202297e31f976616a1b6672a48846635f0b0996d: Status 404 returned error can't find the container with id 8865a3adcd916781ce9d516b202297e31f976616a1b6672a48846635f0b0996d Apr 24 21:16:44.533583 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.533549 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd887b4fc_65dd_41bb_9013_2e633d3ab7a8.slice/crio-037526f55caa6a305ceee26ea49f312871fa23b048ad9278969b9578d23ea7d2 WatchSource:0}: Error finding container 037526f55caa6a305ceee26ea49f312871fa23b048ad9278969b9578d23ea7d2: Status 404 returned error can't find the container with id 037526f55caa6a305ceee26ea49f312871fa23b048ad9278969b9578d23ea7d2 Apr 24 21:16:44.534641 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.534454 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d51ef4d_9ece_4f73_b21d_de62e4a3b68e.slice/crio-7567507812a8ba5919047734ccb6bc20c4f2f0ea0cf41e3f786a1cd246ba0013 WatchSource:0}: Error 
finding container 7567507812a8ba5919047734ccb6bc20c4f2f0ea0cf41e3f786a1cd246ba0013: Status 404 returned error can't find the container with id 7567507812a8ba5919047734ccb6bc20c4f2f0ea0cf41e3f786a1cd246ba0013 Apr 24 21:16:44.535804 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:16:44.535690 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf0483bc_341e_4d8b_a4d1_0e5fb17d43a8.slice/crio-2a610fb011ba0c5a100c59b8639c2c5cfd19f9c68a1b4cdd10b4a86def086400 WatchSource:0}: Error finding container 2a610fb011ba0c5a100c59b8639c2c5cfd19f9c68a1b4cdd10b4a86def086400: Status 404 returned error can't find the container with id 2a610fb011ba0c5a100c59b8639c2c5cfd19f9c68a1b4cdd10b4a86def086400 Apr 24 21:16:44.591091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.591040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:16:44.591201 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:44.591172 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:44.591201 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:44.591192 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:44.591201 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:44.591201 2575 projected.go:194] Error preparing data for projected volume kube-api-access-b2sr2 for pod openshift-network-diagnostics/network-check-target-qfs2f: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:44.591306 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:44.591252 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2 podName:e42a906f-a474-4d11-8a0d-e8ef290b2e14 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:45.591238408 +0000 UTC m=+4.192393950 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2sr2" (UniqueName: "kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2") pod "network-check-target-qfs2f" (UID: "e42a906f-a474-4d11-8a0d-e8ef290b2e14") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:44.984282 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.984181 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:11:42 +0000 UTC" deadline="2027-11-17 13:36:35.462938162 +0000 UTC" Apr 24 21:16:44.984282 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:44.984223 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13720h19m50.478718647s" Apr 24 21:16:45.001596 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.001408 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:16:45.001596 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:45.001544 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:16:45.015398 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.015358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" event={"ID":"6551c028e13aff466c38397d8a508ac4","Type":"ContainerStarted","Data":"a99be4bf6c78f63d6ab1eb640da9851a2ecabf23703c89a4c686f3b19b6b4df2"} Apr 24 21:16:45.019596 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.019564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5r6d9" event={"ID":"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8","Type":"ContainerStarted","Data":"2a610fb011ba0c5a100c59b8639c2c5cfd19f9c68a1b4cdd10b4a86def086400"} Apr 24 21:16:45.024861 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.024831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlds8" event={"ID":"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e","Type":"ContainerStarted","Data":"7567507812a8ba5919047734ccb6bc20c4f2f0ea0cf41e3f786a1cd246ba0013"} Apr 24 21:16:45.028041 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.026193 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerStarted","Data":"8865a3adcd916781ce9d516b202297e31f976616a1b6672a48846635f0b0996d"} Apr 24 21:16:45.037971 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:16:45.037846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lhl59" event={"ID":"a470a779-5d88-4004-a798-9ea3cbcbfe6d","Type":"ContainerStarted","Data":"914f04c5ed62d7d68046dbe1d548f7cdb53ca1296fadfcabdd920efc0dae60d3"} Apr 24 21:16:45.042650 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.042616 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d7mjl" event={"ID":"d887b4fc-65dd-41bb-9013-2e633d3ab7a8","Type":"ContainerStarted","Data":"037526f55caa6a305ceee26ea49f312871fa23b048ad9278969b9578d23ea7d2"} Apr 24 21:16:45.047159 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.047127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" event={"ID":"8cfc495b-d79a-4bed-94c7-6101136f82e7","Type":"ContainerStarted","Data":"81be13ac9052b13cbfebab0ce84aae6917d94b8fe752d1a7b4263ed9a1c8c2cf"} Apr 24 21:16:45.054684 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.054631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"3e20d53ff4716fff5fb4032771f90e21eb8716fe3ec02100c1ac275d11ef8850"} Apr 24 21:16:45.057953 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.057908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8v98v" event={"ID":"c230c44d-8923-409d-a9e3-443872457536","Type":"ContainerStarted","Data":"eafc903b5b2c3c7081ebc26af9ebb88f9cd5117da9aeff83f986c84d628aaf87"} Apr 24 21:16:45.499174 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.499128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: 
\"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:45.499355 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:45.499321 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:45.499420 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:45.499386 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs podName:f3b8d86d-c179-4368-b025-0c7f41b2aa3e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:47.499366543 +0000 UTC m=+6.100522094 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs") pod "network-metrics-daemon-qm8hh" (UID: "f3b8d86d-c179-4368-b025-0c7f41b2aa3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:45.602602 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:45.600171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:45.602602 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:45.600365 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:45.602602 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:45.600391 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:45.602602 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:45.600406 2575 projected.go:194] Error preparing data for projected volume kube-api-access-b2sr2 for pod openshift-network-diagnostics/network-check-target-qfs2f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:45.602602 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:45.600476 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2 podName:e42a906f-a474-4d11-8a0d-e8ef290b2e14 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:47.600454321 +0000 UTC m=+6.201609863 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2sr2" (UniqueName: "kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2") pod "network-check-target-qfs2f" (UID: "e42a906f-a474-4d11-8a0d-e8ef290b2e14") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:46.004747 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:46.003778 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:46.004747 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:46.003921 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:46.074109 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:46.073536 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea9c91cb8023c6edf0f87546014505a5" containerID="9d4a7b63572b3851603c55b8666ab264963830225394a4ca7c8f4f3a5c6b925e" exitCode=0
Apr 24 21:16:46.074981 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:46.074930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" event={"ID":"ea9c91cb8023c6edf0f87546014505a5","Type":"ContainerDied","Data":"9d4a7b63572b3851603c55b8666ab264963830225394a4ca7c8f4f3a5c6b925e"}
Apr 24 21:16:46.096256 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:46.095169 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-46.ec2.internal" podStartSLOduration=3.095146107 podStartE2EDuration="3.095146107s" podCreationTimestamp="2026-04-24 21:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:45.038137319 +0000 UTC m=+3.639292873" watchObservedRunningTime="2026-04-24 21:16:46.095146107 +0000 UTC m=+4.696301669"
Apr 24 21:16:47.001463 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:47.001431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:47.001638 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:47.001557 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:47.086096 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:47.086037 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" event={"ID":"ea9c91cb8023c6edf0f87546014505a5","Type":"ContainerStarted","Data":"ba479293c5b17c71160aeb324cca27d5d1a2d51467177ec67714359e3e25a448"}
Apr 24 21:16:47.520712 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:47.520666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:47.521318 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:47.520888 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:47.521318 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:47.520960 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs podName:f3b8d86d-c179-4368-b025-0c7f41b2aa3e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.520939711 +0000 UTC m=+10.122095255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs") pod "network-metrics-daemon-qm8hh" (UID: "f3b8d86d-c179-4368-b025-0c7f41b2aa3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:47.621788 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:47.621084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:47.621788 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:47.621289 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:47.621788 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:47.621307 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:47.621788 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:47.621320 2575 projected.go:194] Error preparing data for projected volume kube-api-access-b2sr2 for pod openshift-network-diagnostics/network-check-target-qfs2f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:47.621788 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:47.621375 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2 podName:e42a906f-a474-4d11-8a0d-e8ef290b2e14 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:51.621357376 +0000 UTC m=+10.222512923 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2sr2" (UniqueName: "kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2") pod "network-check-target-qfs2f" (UID: "e42a906f-a474-4d11-8a0d-e8ef290b2e14") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:48.001993 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:48.001913 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:48.002178 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:48.002055 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:49.001645 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:49.001608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:49.002137 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:49.001759 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:50.002525 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:50.002289 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:50.002525 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:50.002444 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:51.001254 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:51.001213 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:51.001435 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:51.001353 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:51.556404 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:51.556361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:51.556870 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:51.556547 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:51.556870 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:51.556634 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs podName:f3b8d86d-c179-4368-b025-0c7f41b2aa3e nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.556612849 +0000 UTC m=+18.157768389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs") pod "network-metrics-daemon-qm8hh" (UID: "f3b8d86d-c179-4368-b025-0c7f41b2aa3e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:51.657415 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:51.657302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:51.657584 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:51.657506 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:16:51.657584 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:51.657534 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:16:51.657584 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:51.657549 2575 projected.go:194] Error preparing data for projected volume kube-api-access-b2sr2 for pod openshift-network-diagnostics/network-check-target-qfs2f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:51.657754 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:51.657615 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2 podName:e42a906f-a474-4d11-8a0d-e8ef290b2e14 nodeName:}" failed. No retries permitted until 2026-04-24 21:16:59.65759451 +0000 UTC m=+18.258750056 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2sr2" (UniqueName: "kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2") pod "network-check-target-qfs2f" (UID: "e42a906f-a474-4d11-8a0d-e8ef290b2e14") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:16:52.003226 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.002760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:52.003226 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:52.002878 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:52.714494 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.714446 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-46.ec2.internal" podStartSLOduration=9.714427072 podStartE2EDuration="9.714427072s" podCreationTimestamp="2026-04-24 21:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:16:47.10001557 +0000 UTC m=+5.701171133" watchObservedRunningTime="2026-04-24 21:16:52.714427072 +0000 UTC m=+11.315582635"
Apr 24 21:16:52.715253 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.715219 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fvzdq"]
Apr 24 21:16:52.719492 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.719469 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.719624 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:52.719553 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a"
Apr 24 21:16:52.767523 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.767474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.767686 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.767611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-dbus\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.767686 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.767657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-kubelet-config\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.868429 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.868379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-dbus\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.868611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.868451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName:
\"kubernetes.io/host-path/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-kubelet-config\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.868611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.868492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.868611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.868565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-dbus\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.868803 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:52.868620 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:52.868803 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:52.868637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-kubelet-config\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:52.868803 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:52.868690 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret podName:527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:53.368660817 +0000 UTC m=+11.969816367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret") pod "global-pull-secret-syncer-fvzdq" (UID: "527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:53.001480 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:53.001381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:53.001641 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:53.001515 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:53.372795 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:53.372707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:53.372966 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:53.372860 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:53.372966 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:53.372933 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret podName:527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:54.372912705 +0000 UTC m=+12.974068261 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret") pod "global-pull-secret-syncer-fvzdq" (UID: "527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:54.001509 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:54.001471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:54.001983 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:54.001490 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:54.001983 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:54.001616 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:54.001983 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:54.001661 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a"
Apr 24 21:16:54.381699 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:54.381602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:54.381840 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:54.381755 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:54.381840 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:54.381839 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret podName:527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a nodeName:}" failed. No retries permitted until 2026-04-24 21:16:56.381816296 +0000 UTC m=+14.982971850 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret") pod "global-pull-secret-syncer-fvzdq" (UID: "527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:55.001777 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.001742 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:55.002219 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:55.001852 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:55.870235 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.870204 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rbbz9"]
Apr 24 21:16:55.873872 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.873849 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:55.880279 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.880250 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:16:55.880855 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.880835 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4rdlg\""
Apr 24 21:16:55.881206 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.881188 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:16:55.995199 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.995162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aae051c8-9a36-4936-b30a-80aff28bff26-hosts-file\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:55.995373 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.995215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aae051c8-9a36-4936-b30a-80aff28bff26-tmp-dir\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:55.995373 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:55.995294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrkt5\" (UniqueName: \"kubernetes.io/projected/aae051c8-9a36-4936-b30a-80aff28bff26-kube-api-access-mrkt5\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.002252 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.002222
2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:56.002650 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.002256 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:56.002650 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:56.002360 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a"
Apr 24 21:16:56.002650 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:56.002508 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:56.095818 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.095779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aae051c8-9a36-4936-b30a-80aff28bff26-tmp-dir\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.095818 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.095829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrkt5\" (UniqueName: \"kubernetes.io/projected/aae051c8-9a36-4936-b30a-80aff28bff26-kube-api-access-mrkt5\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.096078 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.095899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aae051c8-9a36-4936-b30a-80aff28bff26-hosts-file\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.096078 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.095975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aae051c8-9a36-4936-b30a-80aff28bff26-hosts-file\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.096184 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.096132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aae051c8-9a36-4936-b30a-80aff28bff26-tmp-dir\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.107147 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.107116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrkt5\" (UniqueName: \"kubernetes.io/projected/aae051c8-9a36-4936-b30a-80aff28bff26-kube-api-access-mrkt5\") pod \"node-resolver-rbbz9\" (UID: \"aae051c8-9a36-4936-b30a-80aff28bff26\") " pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.183105 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.183006 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rbbz9"
Apr 24 21:16:56.398587 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:56.398550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:56.398745 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:56.398696 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:56.398842 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:56.398772 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret podName:527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:00.398751461 +0000 UTC m=+18.999907011 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret") pod "global-pull-secret-syncer-fvzdq" (UID: "527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:16:57.001386 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:57.001299 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:57.001549 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:57.001414 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:58.001657 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:58.001617 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:58.002157 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:58.001617 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq"
Apr 24 21:16:58.002157 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:58.001783 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e"
Apr 24 21:16:58.002157 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:58.001826 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a"
Apr 24 21:16:59.002337 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:59.002124 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:16:59.002882 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:59.002439 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14"
Apr 24 21:16:59.622567 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:59.622532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh"
Apr 24 21:16:59.622736 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:59.622653 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:16:59.622736 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:59.622705 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs podName:f3b8d86d-c179-4368-b025-0c7f41b2aa3e nodeName:}" failed. No retries permitted until 2026-04-24 21:17:15.622690776 +0000 UTC m=+34.223846318 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs") pod "network-metrics-daemon-qm8hh" (UID: "f3b8d86d-c179-4368-b025-0c7f41b2aa3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:16:59.723227 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:16:59.723190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:16:59.723404 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:59.723344 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:16:59.723404 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:59.723362 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:16:59.723404 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:59.723374 2575 projected.go:194] Error preparing data for projected volume kube-api-access-b2sr2 for pod openshift-network-diagnostics/network-check-target-qfs2f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:16:59.723528 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:16:59.723435 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2 podName:e42a906f-a474-4d11-8a0d-e8ef290b2e14 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:15.723416386 +0000 UTC m=+34.324571929 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2sr2" (UniqueName: "kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2") pod "network-check-target-qfs2f" (UID: "e42a906f-a474-4d11-8a0d-e8ef290b2e14") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:00.002221 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:00.002133 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:00.002390 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:00.002257 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:00.002390 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:00.002321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:00.002745 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:00.002415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:00.428448 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:00.428355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:00.428600 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:00.428511 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:17:00.428600 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:00.428579 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret podName:527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:08.428564225 +0000 UTC m=+27.029719769 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret") pod "global-pull-secret-syncer-fvzdq" (UID: "527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:17:01.001759 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:01.001722 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:01.001917 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:01.001854 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:01.188966 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:01.188936 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae051c8_9a36_4936_b30a_80aff28bff26.slice/crio-96d58614ae489e4532b13dc3f8050b2bddd884ccd655acd54cb84dd1820a9d8e WatchSource:0}: Error finding container 96d58614ae489e4532b13dc3f8050b2bddd884ccd655acd54cb84dd1820a9d8e: Status 404 returned error can't find the container with id 96d58614ae489e4532b13dc3f8050b2bddd884ccd655acd54cb84dd1820a9d8e Apr 24 21:17:02.003294 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.003002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:02.003440 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.003099 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:02.003440 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:02.003378 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:02.003522 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:02.003469 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:02.115557 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.115518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d7mjl" event={"ID":"d887b4fc-65dd-41bb-9013-2e633d3ab7a8","Type":"ContainerStarted","Data":"ef980399bce3316eff5afa640a3a42c847a6981daad6e0ac4b7d7d05253b132b"} Apr 24 21:17:02.117129 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.117096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" event={"ID":"8cfc495b-d79a-4bed-94c7-6101136f82e7","Type":"ContainerStarted","Data":"88fade7bcd1499201d68953b4c1445600129b60e8a526f0f12e4fd95f5e4b142"} Apr 24 21:17:02.120035 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120015 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:17:02.120501 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120474 2575 generic.go:358] "Generic (PLEG): container finished" podID="e3b4de04-a724-4231-a103-ae88c77beb64" containerID="3ce1ad1cae8e98f18cc47c1dd97f93b3f560262eded134fa634af8309a45a8ee" exitCode=1 Apr 24 21:17:02.120598 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" 
event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"924cd26a8e460f9fe13607ca41e98c1c2dd73125d2dcc6af70fc9f98af0964fd"} Apr 24 21:17:02.120598 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"75289a260f720e4cf94a64ba5eb8df8e8f4869b5ee8cd37e1a5af052f865c1cc"} Apr 24 21:17:02.120598 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"cb57bdfff24282eac82696c1ed9d42595b6ead5f4fda85b48486f300e1f198e0"} Apr 24 21:17:02.120773 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120600 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"39d8eb2e7049ccb6b59ab76140a9be46c975384ee8a4fc7f77b150a7bdb307f1"} Apr 24 21:17:02.120773 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120612 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerDied","Data":"3ce1ad1cae8e98f18cc47c1dd97f93b3f560262eded134fa634af8309a45a8ee"} Apr 24 21:17:02.120773 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.120623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"62b957cdb0e662ada64be6a4fd6dba9291e9a369b5d07641f37b51ada8d62ba1"} Apr 24 21:17:02.122364 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.122337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-rbbz9" event={"ID":"aae051c8-9a36-4936-b30a-80aff28bff26","Type":"ContainerStarted","Data":"a55642d8c33e6c797e4a71e0a6132d79fdc534833b6acae41172141bf5b96692"} Apr 24 21:17:02.122489 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.122373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rbbz9" event={"ID":"aae051c8-9a36-4936-b30a-80aff28bff26","Type":"ContainerStarted","Data":"96d58614ae489e4532b13dc3f8050b2bddd884ccd655acd54cb84dd1820a9d8e"} Apr 24 21:17:02.123817 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.123792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5r6d9" event={"ID":"af0483bc-341e-4d8b-a4d1-0e5fb17d43a8","Type":"ContainerStarted","Data":"006f4f443611112067630675a049a5afa94f3219142bda254df57b63896fa23a"} Apr 24 21:17:02.125785 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.125757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlds8" event={"ID":"2d51ef4d-9ece-4f73-b21d-de62e4a3b68e","Type":"ContainerStarted","Data":"f21c389c78a2034338d084bd872427db8136a883b2a53b243ac6a1f011b55b65"} Apr 24 21:17:02.127403 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.127367 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b89c06b-ff11-4cc0-bd26-7f792a0f1702" containerID="c3ba0ddbe7f0828536007be9341993bfa2e8c99e476a06e1aad76b780d432890" exitCode=0 Apr 24 21:17:02.127505 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.127438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerDied","Data":"c3ba0ddbe7f0828536007be9341993bfa2e8c99e476a06e1aad76b780d432890"} Apr 24 21:17:02.129143 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.129090 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d7mjl" 
podStartSLOduration=8.103117877 podStartE2EDuration="20.129054582s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.535344169 +0000 UTC m=+3.136499708" lastFinishedPulling="2026-04-24 21:16:56.561280871 +0000 UTC m=+15.162436413" observedRunningTime="2026-04-24 21:17:02.128965301 +0000 UTC m=+20.730120863" watchObservedRunningTime="2026-04-24 21:17:02.129054582 +0000 UTC m=+20.730210144" Apr 24 21:17:02.129221 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.129163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lhl59" event={"ID":"a470a779-5d88-4004-a798-9ea3cbcbfe6d","Type":"ContainerStarted","Data":"db478bbcd8498cd2519656bf6c95c2e52fbded3921b75795bacb35406b132442"} Apr 24 21:17:02.144222 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.144161 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rbbz9" podStartSLOduration=7.144140686 podStartE2EDuration="7.144140686s" podCreationTimestamp="2026-04-24 21:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:02.143601821 +0000 UTC m=+20.744757383" watchObservedRunningTime="2026-04-24 21:17:02.144140686 +0000 UTC m=+20.745296248" Apr 24 21:17:02.181724 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.181671 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5r6d9" podStartSLOduration=3.584284148 podStartE2EDuration="20.181655897s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.538048498 +0000 UTC m=+3.139204038" lastFinishedPulling="2026-04-24 21:17:01.135420232 +0000 UTC m=+19.736575787" observedRunningTime="2026-04-24 21:17:02.181131938 +0000 UTC m=+20.782287500" watchObservedRunningTime="2026-04-24 21:17:02.181655897 +0000 UTC m=+20.782811458" Apr 
24 21:17:02.198053 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.197970 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vlds8" podStartSLOduration=3.5472064530000003 podStartE2EDuration="20.197951625s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.537020333 +0000 UTC m=+3.138175877" lastFinishedPulling="2026-04-24 21:17:01.187765496 +0000 UTC m=+19.788921049" observedRunningTime="2026-04-24 21:17:02.197682571 +0000 UTC m=+20.798838150" watchObservedRunningTime="2026-04-24 21:17:02.197951625 +0000 UTC m=+20.799107186" Apr 24 21:17:02.216783 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.216734 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lhl59" podStartSLOduration=3.570283925 podStartE2EDuration="20.216717575s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.529979068 +0000 UTC m=+3.131134610" lastFinishedPulling="2026-04-24 21:17:01.176412706 +0000 UTC m=+19.777568260" observedRunningTime="2026-04-24 21:17:02.216347855 +0000 UTC m=+20.817503415" watchObservedRunningTime="2026-04-24 21:17:02.216717575 +0000 UTC m=+20.817873135" Apr 24 21:17:02.547791 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.547548 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:17:02.636930 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.636839 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:17:02.935278 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.935112 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:17:02.547785143Z","UUID":"3a27d64f-af81-4266-81d8-36e488e9a381","Handler":null,"Name":"","Endpoint":""} Apr 24 21:17:02.936900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.936875 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:17:02.937034 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:02.936909 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:17:03.001972 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:03.001943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:03.002171 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:03.002088 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:03.133168 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:03.133133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8v98v" event={"ID":"c230c44d-8923-409d-a9e3-443872457536","Type":"ContainerStarted","Data":"e03091322e7221329a5236718111baf9ea8ae85028a33d91633c3632cc67f426"} Apr 24 21:17:03.134990 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:03.134961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" event={"ID":"8cfc495b-d79a-4bed-94c7-6101136f82e7","Type":"ContainerStarted","Data":"afe5c19ce71e91fe79827d04d255682adaa11b8162100b234d380ece82204427"} Apr 24 21:17:03.148871 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:03.148818 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8v98v" podStartSLOduration=4.542202751 podStartE2EDuration="21.148803058s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.528828651 +0000 UTC m=+3.129984195" lastFinishedPulling="2026-04-24 21:17:01.13542896 +0000 UTC m=+19.736584502" observedRunningTime="2026-04-24 21:17:03.148507047 +0000 UTC m=+21.749662608" watchObservedRunningTime="2026-04-24 21:17:03.148803058 +0000 UTC m=+21.749958619" Apr 24 21:17:03.615286 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:03.615257 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:17:03.616450 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:03.616428 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:17:04.001393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:04.001303 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:04.001393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:04.001319 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:04.001611 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:04.001427 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:04.001611 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:04.001562 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:04.140277 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:04.140195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" event={"ID":"8cfc495b-d79a-4bed-94c7-6101136f82e7","Type":"ContainerStarted","Data":"b7a71ca5fc63c34a8f8373d8392f090280a4e6cc2eac3f7ef66e843e7fd0b3a5"} Apr 24 21:17:04.141153 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:04.141128 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5r6d9" Apr 24 21:17:04.158579 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:04.158539 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zqst4" podStartSLOduration=3.078191862 podStartE2EDuration="22.158524631s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.533624149 +0000 UTC m=+3.134779702" lastFinishedPulling="2026-04-24 21:17:03.613956921 +0000 UTC m=+22.215112471" observedRunningTime="2026-04-24 21:17:04.158379779 +0000 UTC m=+22.759535354" watchObservedRunningTime="2026-04-24 21:17:04.158524631 +0000 UTC m=+22.759680191" Apr 24 21:17:05.001611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:05.001579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:05.002278 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:05.001702 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:05.147205 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:05.147173 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:17:05.147616 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:05.147582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"eb2b6bd0a999e3df4a277353ca859df839d0fafa62cd806f3fd7999773678006"} Apr 24 21:17:06.001599 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:06.001565 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:06.001783 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:06.001703 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:06.001783 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:06.001750 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:06.002230 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:06.001866 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:07.002267 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.002059 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:07.002751 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:07.002342 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:07.153040 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.153000 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b89c06b-ff11-4cc0-bd26-7f792a0f1702" containerID="02abb72758197c3d80730143d6ee076f38ef1fade6ca0cacb6ea13b9a38506f4" exitCode=0 Apr 24 21:17:07.153214 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.153098 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerDied","Data":"02abb72758197c3d80730143d6ee076f38ef1fade6ca0cacb6ea13b9a38506f4"} Apr 24 21:17:07.156057 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.156017 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:17:07.156357 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.156338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" 
event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"98bae297c22ed9030a11dffe5b73853d0581a00d0b6127614d7169fddb7eab2c"} Apr 24 21:17:07.156662 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.156647 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:17:07.156833 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.156817 2575 scope.go:117] "RemoveContainer" containerID="3ce1ad1cae8e98f18cc47c1dd97f93b3f560262eded134fa634af8309a45a8ee" Apr 24 21:17:07.172969 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:07.172949 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:17:08.004308 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.004282 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:08.004679 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.004309 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:08.004679 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:08.004385 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:08.004679 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:08.004521 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:08.159795 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.159606 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b89c06b-ff11-4cc0-bd26-7f792a0f1702" containerID="b2ad17e1ce49ff57f3d29d4bc02688e296da97d7cb72c617863a074bda11a239" exitCode=0 Apr 24 21:17:08.159944 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.159639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerDied","Data":"b2ad17e1ce49ff57f3d29d4bc02688e296da97d7cb72c617863a074bda11a239"} Apr 24 21:17:08.163392 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.163372 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:17:08.163699 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.163679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" event={"ID":"e3b4de04-a724-4231-a103-ae88c77beb64","Type":"ContainerStarted","Data":"a3f051b75d11e1f05756a97d6ba7ded8244a9227dd193e215f17fb9a509939ad"} Apr 24 21:17:08.163900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.163883 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 
21:17:08.163963 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.163912 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:17:08.178679 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.178655 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:17:08.233399 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.233347 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" podStartSLOduration=9.382174291 podStartE2EDuration="26.233333368s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.530603419 +0000 UTC m=+3.131758959" lastFinishedPulling="2026-04-24 21:17:01.381762483 +0000 UTC m=+19.982918036" observedRunningTime="2026-04-24 21:17:08.232985905 +0000 UTC m=+26.834141465" watchObservedRunningTime="2026-04-24 21:17:08.233333368 +0000 UTC m=+26.834488928" Apr 24 21:17:08.320898 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.320867 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qfs2f"] Apr 24 21:17:08.321058 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.320994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:08.321169 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:08.321095 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:08.326665 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.326643 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fvzdq"] Apr 24 21:17:08.326781 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.326767 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:08.326883 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:08.326860 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:08.331284 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.331256 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qm8hh"] Apr 24 21:17:08.331396 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.331384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:08.331519 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:08.331498 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:08.489296 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:08.489205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:08.489454 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:08.489374 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:17:08.489454 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:08.489451 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret podName:527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a nodeName:}" failed. No retries permitted until 2026-04-24 21:17:24.489433224 +0000 UTC m=+43.090588774 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret") pod "global-pull-secret-syncer-fvzdq" (UID: "527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:17:09.167532 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:09.167448 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b89c06b-ff11-4cc0-bd26-7f792a0f1702" containerID="61f6bd43a7951e2de731106c999f662c9dd16399e31541dabda9edc852356d1d" exitCode=0 Apr 24 21:17:09.168153 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:09.167528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerDied","Data":"61f6bd43a7951e2de731106c999f662c9dd16399e31541dabda9edc852356d1d"} Apr 24 21:17:10.005154 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:10.005123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:10.005404 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:10.005123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:10.005404 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:10.005250 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:10.005404 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:10.005332 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:10.005404 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:10.005123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:10.005539 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:10.005417 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:11.352220 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:11.352189 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rbbz9_aae051c8-9a36-4936-b30a-80aff28bff26/dns-node-resolver/0.log" Apr 24 21:17:12.005620 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:12.005584 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:12.005809 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:12.005707 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:12.006191 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:12.006158 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:12.006316 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:12.006258 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:12.006384 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:12.006324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:12.006436 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:12.006388 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:12.335537 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:12.335506 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d7mjl_d887b4fc-65dd-41bb-9013-2e633d3ab7a8/node-ca/0.log" Apr 24 21:17:14.001835 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.001802 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:14.001835 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.001839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:14.002352 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.001802 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:14.002352 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.001947 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qm8hh" podUID="f3b8d86d-c179-4368-b025-0c7f41b2aa3e" Apr 24 21:17:14.002352 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.002049 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fvzdq" podUID="527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a" Apr 24 21:17:14.002352 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.002174 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qfs2f" podUID="e42a906f-a474-4d11-8a0d-e8ef290b2e14" Apr 24 21:17:14.148266 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.148189 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-46.ec2.internal" event="NodeReady" Apr 24 21:17:14.148419 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.148352 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:17:14.189550 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.189517 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7649b9b64b-xms8x"] Apr 24 21:17:14.220726 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.220698 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4jr7n"] Apr 24 21:17:14.220939 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.220914 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.224511 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.224481 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:17:14.224671 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.224579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:17:14.224763 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.224740 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dqkgc\"" Apr 24 21:17:14.225746 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.225725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:17:14.229578 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.229557 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:17:14.236508 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.236485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7649b9b64b-xms8x"] Apr 24 21:17:14.236658 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.236516 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l2wft"] Apr 24 21:17:14.236658 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.236584 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.238905 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.238883 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:17:14.239221 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.239204 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9vgbs\"" Apr 24 21:17:14.239558 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.239536 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:17:14.260560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.260531 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4jr7n"] Apr 24 21:17:14.260560 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.260568 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l2wft"] Apr 24 21:17:14.260787 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.260582 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z8ffq"] Apr 24 21:17:14.260787 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.260754 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.263332 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.263305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:17:14.263473 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.263342 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-26gnc\"" Apr 24 21:17:14.263634 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.263616 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:17:14.263711 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.263690 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:17:14.263811 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.263791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:17:14.276480 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.276456 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z8ffq"] Apr 24 21:17:14.276480 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.276471 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:14.278735 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.278702 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:17:14.278861 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.278844 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:17:14.278972 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.278951 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6g56g\"" Apr 24 21:17:14.279098 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.278957 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:17:14.335202 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.335393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-data-volume\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.335393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-trusted-ca\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.335393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-installation-pull-secrets\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.335393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22015fa1-7a0e-402e-b16d-c8403c4becda-tmp-dir\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.335393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-crio-socket\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.335654 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-certificates\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " 
pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.335654 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.335654 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66ff2b4e-2b04-4b4e-9761-f1430f4296de-ca-trust-extracted\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.335654 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22015fa1-7a0e-402e-b16d-c8403c4becda-config-volume\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.335654 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb59g\" (UniqueName: \"kubernetes.io/projected/22015fa1-7a0e-402e-b16d-c8403c4becda-kube-api-access-kb59g\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.335654 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-image-registry-private-configuration\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.335941 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-bound-sa-token\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.335941 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.335941 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.335941 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgmhq\" (UniqueName: \"kubernetes.io/projected/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-kube-api-access-hgmhq\") pod \"insights-runtime-extractor-l2wft\" (UID: 
\"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.335941 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.335845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhx9\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-kube-api-access-zjhx9\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.436727 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.436642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-certificates\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.436893 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.436737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.436893 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.436786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:14.436893 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.436841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66ff2b4e-2b04-4b4e-9761-f1430f4296de-ca-trust-extracted\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437052 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.436887 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:14.437052 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.436911 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7649b9b64b-xms8x: secret "image-registry-tls" not found Apr 24 21:17:14.437052 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.436891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22015fa1-7a0e-402e-b16d-c8403c4becda-config-volume\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.437204 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb59g\" (UniqueName: \"kubernetes.io/projected/22015fa1-7a0e-402e-b16d-c8403c4becda-kube-api-access-kb59g\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.437204 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-image-registry-private-configuration\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " 
pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437204 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-bound-sa-token\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437204 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgmhq\" (UniqueName: \"kubernetes.io/projected/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-kube-api-access-hgmhq\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/66ff2b4e-2b04-4b4e-9761-f1430f4296de-ca-trust-extracted\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.437268 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls podName:66ff2b4e-2b04-4b4e-9761-f1430f4296de nodeName:}" failed. No retries permitted until 2026-04-24 21:17:14.937249242 +0000 UTC m=+33.538404781 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls") pod "image-registry-7649b9b64b-xms8x" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de") : secret "image-registry-tls" not found Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhx9\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-kube-api-access-zjhx9\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.437334 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not 
found Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8btp\" (UniqueName: \"kubernetes.io/projected/ccd8485f-5d27-4b5c-8155-58281c06943e-kube-api-access-t8btp\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:14.437405 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.437334 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437438 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22015fa1-7a0e-402e-b16d-c8403c4becda-config-volume\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-data-volume\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-trusted-ca\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.437506 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls podName:a78c9c94-bf81-4cf1-9862-dd3d48a12eba nodeName:}" failed. No retries permitted until 2026-04-24 21:17:14.937490462 +0000 UTC m=+33.538646015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l2wft" (UID: "a78c9c94-bf81-4cf1-9862-dd3d48a12eba") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-certificates\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.437541 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls podName:22015fa1-7a0e-402e-b16d-c8403c4becda nodeName:}" failed. No retries permitted until 2026-04-24 21:17:14.93752879 +0000 UTC m=+33.538684333 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls") pod "dns-default-4jr7n" (UID: "22015fa1-7a0e-402e-b16d-c8403c4becda") : secret "dns-default-metrics-tls" not found Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-installation-pull-secrets\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22015fa1-7a0e-402e-b16d-c8403c4becda-tmp-dir\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-data-volume\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.437863 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22015fa1-7a0e-402e-b16d-c8403c4becda-tmp-dir\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.438332 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.438332 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.437982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-crio-socket\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.438332 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.438120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-crio-socket\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.438974 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.438788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-trusted-ca\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.443395 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.443371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-image-registry-private-configuration\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 
21:17:14.443534 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.443401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-installation-pull-secrets\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.451030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.451008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-bound-sa-token\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.451350 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.451334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhx9\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-kube-api-access-zjhx9\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.451894 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.451863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgmhq\" (UniqueName: \"kubernetes.io/projected/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-kube-api-access-hgmhq\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.452137 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.452119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb59g\" (UniqueName: \"kubernetes.io/projected/22015fa1-7a0e-402e-b16d-c8403c4becda-kube-api-access-kb59g\") 
pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.539548 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.539445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:14.539752 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.539636 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:14.539752 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.539708 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert podName:ccd8485f-5d27-4b5c-8155-58281c06943e nodeName:}" failed. No retries permitted until 2026-04-24 21:17:15.039687019 +0000 UTC m=+33.640842574 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert") pod "ingress-canary-z8ffq" (UID: "ccd8485f-5d27-4b5c-8155-58281c06943e") : secret "canary-serving-cert" not found Apr 24 21:17:14.542331 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.540153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8btp\" (UniqueName: \"kubernetes.io/projected/ccd8485f-5d27-4b5c-8155-58281c06943e-kube-api-access-t8btp\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:14.550480 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.550450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8btp\" (UniqueName: \"kubernetes.io/projected/ccd8485f-5d27-4b5c-8155-58281c06943e-kube-api-access-t8btp\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:14.942433 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.942394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:14.942433 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.942441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:14.942695 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:14.942493 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:14.942695 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.942584 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:17:14.942695 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.942628 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:17:14.942695 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.942657 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls podName:22015fa1-7a0e-402e-b16d-c8403c4becda nodeName:}" failed. No retries permitted until 2026-04-24 21:17:15.942641536 +0000 UTC m=+34.543797076 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls") pod "dns-default-4jr7n" (UID: "22015fa1-7a0e-402e-b16d-c8403c4becda") : secret "dns-default-metrics-tls" not found Apr 24 21:17:14.942695 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.942671 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:14.942695 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.942684 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7649b9b64b-xms8x: secret "image-registry-tls" not found Apr 24 21:17:14.942898 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.942703 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls podName:a78c9c94-bf81-4cf1-9862-dd3d48a12eba nodeName:}" failed. No retries permitted until 2026-04-24 21:17:15.942684537 +0000 UTC m=+34.543840076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l2wft" (UID: "a78c9c94-bf81-4cf1-9862-dd3d48a12eba") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:14.942898 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:14.942736 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls podName:66ff2b4e-2b04-4b4e-9761-f1430f4296de nodeName:}" failed. No retries permitted until 2026-04-24 21:17:15.942724793 +0000 UTC m=+34.543880341 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls") pod "image-registry-7649b9b64b-xms8x" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de") : secret "image-registry-tls" not found Apr 24 21:17:15.043502 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:15.043462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:15.043904 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.043616 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:15.043904 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.043692 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert podName:ccd8485f-5d27-4b5c-8155-58281c06943e nodeName:}" failed. No retries permitted until 2026-04-24 21:17:16.043676643 +0000 UTC m=+34.644832181 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert") pod "ingress-canary-z8ffq" (UID: "ccd8485f-5d27-4b5c-8155-58281c06943e") : secret "canary-serving-cert" not found Apr 24 21:17:15.649520 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:15.649478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:15.649710 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.649592 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:15.649710 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.649646 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs podName:f3b8d86d-c179-4368-b025-0c7f41b2aa3e nodeName:}" failed. No retries permitted until 2026-04-24 21:17:47.649632192 +0000 UTC m=+66.250787736 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs") pod "network-metrics-daemon-qm8hh" (UID: "f3b8d86d-c179-4368-b025-0c7f41b2aa3e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:17:15.750341 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:15.750298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:15.750507 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.750453 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:17:15.750507 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.750474 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:17:15.750507 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.750483 2575 projected.go:194] Error preparing data for projected volume kube-api-access-b2sr2 for pod openshift-network-diagnostics/network-check-target-qfs2f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:15.750612 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.750536 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2 podName:e42a906f-a474-4d11-8a0d-e8ef290b2e14 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:17:47.75052046 +0000 UTC m=+66.351676010 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2sr2" (UniqueName: "kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2") pod "network-check-target-qfs2f" (UID: "e42a906f-a474-4d11-8a0d-e8ef290b2e14") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:17:15.951529 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:15.951446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:15.951529 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:15.951494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:15.951712 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:15.951573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:15.951712 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.951669 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:17:15.951712 
ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.951677 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:15.951712 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.951694 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7649b9b64b-xms8x: secret "image-registry-tls" not found Apr 24 21:17:15.951823 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.951723 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls podName:22015fa1-7a0e-402e-b16d-c8403c4becda nodeName:}" failed. No retries permitted until 2026-04-24 21:17:17.951708725 +0000 UTC m=+36.552864265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls") pod "dns-default-4jr7n" (UID: "22015fa1-7a0e-402e-b16d-c8403c4becda") : secret "dns-default-metrics-tls" not found Apr 24 21:17:15.951823 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.951737 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls podName:66ff2b4e-2b04-4b4e-9761-f1430f4296de nodeName:}" failed. No retries permitted until 2026-04-24 21:17:17.951730549 +0000 UTC m=+36.552886088 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls") pod "image-registry-7649b9b64b-xms8x" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de") : secret "image-registry-tls" not found Apr 24 21:17:15.951823 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.951669 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:17:15.951823 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:15.951758 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls podName:a78c9c94-bf81-4cf1-9862-dd3d48a12eba nodeName:}" failed. No retries permitted until 2026-04-24 21:17:17.951753519 +0000 UTC m=+36.552909057 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l2wft" (UID: "a78c9c94-bf81-4cf1-9862-dd3d48a12eba") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:16.002173 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.002141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:16.002173 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.002166 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:16.002362 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.002141 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:16.005688 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.005659 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6pwcs\"" Apr 24 21:17:16.005688 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.005679 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gjqgx\"" Apr 24 21:17:16.006142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.005730 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:17:16.006142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.005732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:17:16.006142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.005820 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:17:16.006142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.006021 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:17:16.052251 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.052220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:16.052619 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:16.052376 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:16.052619 ip-10-0-141-46 
kubenswrapper[2575]: E0424 21:17:16.052456 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert podName:ccd8485f-5d27-4b5c-8155-58281c06943e nodeName:}" failed. No retries permitted until 2026-04-24 21:17:18.052440308 +0000 UTC m=+36.653595847 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert") pod "ingress-canary-z8ffq" (UID: "ccd8485f-5d27-4b5c-8155-58281c06943e") : secret "canary-serving-cert" not found Apr 24 21:17:16.182896 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.182868 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b89c06b-ff11-4cc0-bd26-7f792a0f1702" containerID="cca0a902782fd2f39b7f174c75e5621fa7fbe40f4a2f1c1406253d2aa38830b0" exitCode=0 Apr 24 21:17:16.183058 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:16.182906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerDied","Data":"cca0a902782fd2f39b7f174c75e5621fa7fbe40f4a2f1c1406253d2aa38830b0"} Apr 24 21:17:17.186979 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:17.186940 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b89c06b-ff11-4cc0-bd26-7f792a0f1702" containerID="a7b87bf69bf149da0bceb7930d78df70426003f13f6a09bbe1c02913000ed5c0" exitCode=0 Apr 24 21:17:17.187488 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:17.187006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerDied","Data":"a7b87bf69bf149da0bceb7930d78df70426003f13f6a09bbe1c02913000ed5c0"} Apr 24 21:17:17.968108 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:17.968043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:17.968264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:17.968176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:17.968264 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:17.968196 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:17:17.968264 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:17.968213 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7649b9b64b-xms8x: secret "image-registry-tls" not found Apr 24 21:17:17.968264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:17.968215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:17.968389 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:17.968332 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:17:17.968389 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:17.968335 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 
21:17:17.968389 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:17.968354 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls podName:66ff2b4e-2b04-4b4e-9761-f1430f4296de nodeName:}" failed. No retries permitted until 2026-04-24 21:17:21.968333518 +0000 UTC m=+40.569489075 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls") pod "image-registry-7649b9b64b-xms8x" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de") : secret "image-registry-tls" not found Apr 24 21:17:17.968389 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:17.968373 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls podName:a78c9c94-bf81-4cf1-9862-dd3d48a12eba nodeName:}" failed. No retries permitted until 2026-04-24 21:17:21.96836401 +0000 UTC m=+40.569519549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l2wft" (UID: "a78c9c94-bf81-4cf1-9862-dd3d48a12eba") : secret "insights-runtime-extractor-tls" not found Apr 24 21:17:17.968389 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:17.968384 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls podName:22015fa1-7a0e-402e-b16d-c8403c4becda nodeName:}" failed. No retries permitted until 2026-04-24 21:17:21.968379695 +0000 UTC m=+40.569535233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls") pod "dns-default-4jr7n" (UID: "22015fa1-7a0e-402e-b16d-c8403c4becda") : secret "dns-default-metrics-tls" not found Apr 24 21:17:18.068752 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:18.068720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:18.068887 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:18.068869 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:17:18.068941 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:18.068930 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert podName:ccd8485f-5d27-4b5c-8155-58281c06943e nodeName:}" failed. No retries permitted until 2026-04-24 21:17:22.068914331 +0000 UTC m=+40.670069875 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert") pod "ingress-canary-z8ffq" (UID: "ccd8485f-5d27-4b5c-8155-58281c06943e") : secret "canary-serving-cert" not found Apr 24 21:17:18.191551 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:18.191478 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r8829" event={"ID":"5b89c06b-ff11-4cc0-bd26-7f792a0f1702","Type":"ContainerStarted","Data":"d8d70987fed10c31b096c0fde02a1852d3e22dd3ab394450e9cb87cada132ffa"} Apr 24 21:17:18.220602 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:18.220556 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-r8829" podStartSLOduration=5.623797373 podStartE2EDuration="36.220542713s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:16:44.536051854 +0000 UTC m=+3.137207396" lastFinishedPulling="2026-04-24 21:17:15.132797182 +0000 UTC m=+33.733952736" observedRunningTime="2026-04-24 21:17:18.220204436 +0000 UTC m=+36.821360016" watchObservedRunningTime="2026-04-24 21:17:18.220542713 +0000 UTC m=+36.821698270" Apr 24 21:17:22.000978 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.000947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:22.001481 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.001011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls\") pod \"dns-default-4jr7n\" (UID: 
\"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:22.001481 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.001034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:22.004536 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.004505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a78c9c94-bf81-4cf1-9862-dd3d48a12eba-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2wft\" (UID: \"a78c9c94-bf81-4cf1-9862-dd3d48a12eba\") " pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:22.004536 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.004505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22015fa1-7a0e-402e-b16d-c8403c4becda-metrics-tls\") pod \"dns-default-4jr7n\" (UID: \"22015fa1-7a0e-402e-b16d-c8403c4becda\") " pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:22.004691 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.004565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"image-registry-7649b9b64b-xms8x\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:22.033915 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.033865 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:22.048749 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.048721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:22.071633 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.071549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l2wft" Apr 24 21:17:22.101876 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.101748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:22.106278 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.106244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccd8485f-5d27-4b5c-8155-58281c06943e-cert\") pod \"ingress-canary-z8ffq\" (UID: \"ccd8485f-5d27-4b5c-8155-58281c06943e\") " pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:22.186744 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.186682 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7649b9b64b-xms8x"] Apr 24 21:17:22.203849 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.203523 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4jr7n"] Apr 24 21:17:22.211787 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:22.211740 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ff2b4e_2b04_4b4e_9761_f1430f4296de.slice/crio-5b6f7cfbeb46184ece9082dbc015ada1598344c8e9e5c6d62d98960b7c8314b3 WatchSource:0}: Error finding container 
5b6f7cfbeb46184ece9082dbc015ada1598344c8e9e5c6d62d98960b7c8314b3: Status 404 returned error can't find the container with id 5b6f7cfbeb46184ece9082dbc015ada1598344c8e9e5c6d62d98960b7c8314b3 Apr 24 21:17:22.216762 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:22.216737 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22015fa1_7a0e_402e_b16d_c8403c4becda.slice/crio-4e6cd7fdea0d8727ebe84781eb5037293c418deac8cc6a2340e3cd075bc5bef4 WatchSource:0}: Error finding container 4e6cd7fdea0d8727ebe84781eb5037293c418deac8cc6a2340e3cd075bc5bef4: Status 404 returned error can't find the container with id 4e6cd7fdea0d8727ebe84781eb5037293c418deac8cc6a2340e3cd075bc5bef4 Apr 24 21:17:22.228712 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.228688 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l2wft"] Apr 24 21:17:22.232474 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:22.232445 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda78c9c94_bf81_4cf1_9862_dd3d48a12eba.slice/crio-671f5100616ba292633ecb588cd05201f227b13e6b268efc51126b71a693ae36 WatchSource:0}: Error finding container 671f5100616ba292633ecb588cd05201f227b13e6b268efc51126b71a693ae36: Status 404 returned error can't find the container with id 671f5100616ba292633ecb588cd05201f227b13e6b268efc51126b71a693ae36 Apr 24 21:17:22.386778 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.386741 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z8ffq" Apr 24 21:17:22.507257 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:22.507224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z8ffq"] Apr 24 21:17:22.510800 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:22.510774 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd8485f_5d27_4b5c_8155_58281c06943e.slice/crio-48f891deb9c8eb940c0c5aa0a744e4664a09dfb866c14860191ae2aaaa11df8b WatchSource:0}: Error finding container 48f891deb9c8eb940c0c5aa0a744e4664a09dfb866c14860191ae2aaaa11df8b: Status 404 returned error can't find the container with id 48f891deb9c8eb940c0c5aa0a744e4664a09dfb866c14860191ae2aaaa11df8b Apr 24 21:17:23.213929 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.213839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2wft" event={"ID":"a78c9c94-bf81-4cf1-9862-dd3d48a12eba","Type":"ContainerStarted","Data":"6d20bb7011e2207b5e23d6d77877b56442188f58fb23fdbe6bfc4145029b5b8c"} Apr 24 21:17:23.213929 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.213882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2wft" event={"ID":"a78c9c94-bf81-4cf1-9862-dd3d48a12eba","Type":"ContainerStarted","Data":"eb14d4535f33389b202a446c410cbb6d088b64d78975a0e7c7e968f88e7a7362"} Apr 24 21:17:23.213929 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.213895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2wft" event={"ID":"a78c9c94-bf81-4cf1-9862-dd3d48a12eba","Type":"ContainerStarted","Data":"671f5100616ba292633ecb588cd05201f227b13e6b268efc51126b71a693ae36"} Apr 24 21:17:23.215445 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.215415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-4jr7n" event={"ID":"22015fa1-7a0e-402e-b16d-c8403c4becda","Type":"ContainerStarted","Data":"4e6cd7fdea0d8727ebe84781eb5037293c418deac8cc6a2340e3cd075bc5bef4"} Apr 24 21:17:23.217253 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.217192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" event={"ID":"66ff2b4e-2b04-4b4e-9761-f1430f4296de","Type":"ContainerStarted","Data":"74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6"} Apr 24 21:17:23.217253 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.217227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" event={"ID":"66ff2b4e-2b04-4b4e-9761-f1430f4296de","Type":"ContainerStarted","Data":"5b6f7cfbeb46184ece9082dbc015ada1598344c8e9e5c6d62d98960b7c8314b3"} Apr 24 21:17:23.217412 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.217270 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:23.218956 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.218927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z8ffq" event={"ID":"ccd8485f-5d27-4b5c-8155-58281c06943e","Type":"ContainerStarted","Data":"48f891deb9c8eb940c0c5aa0a744e4664a09dfb866c14860191ae2aaaa11df8b"} Apr 24 21:17:23.240344 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:23.240286 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" podStartSLOduration=48.240266343 podStartE2EDuration="48.240266343s" podCreationTimestamp="2026-04-24 21:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:23.239083284 +0000 UTC m=+41.840238842" 
watchObservedRunningTime="2026-04-24 21:17:23.240266343 +0000 UTC m=+41.841421905" Apr 24 21:17:24.225978 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.225895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4jr7n" event={"ID":"22015fa1-7a0e-402e-b16d-c8403c4becda","Type":"ContainerStarted","Data":"4914dc0557d508d8c9e5dda0a206762e0f538341dd7205c32a21cc066fbf0401"} Apr 24 21:17:24.225978 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.225938 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4jr7n" event={"ID":"22015fa1-7a0e-402e-b16d-c8403c4becda","Type":"ContainerStarted","Data":"e11ef5454b7f798299c8e86721908028052d6a65c1f14d2b1e0b019096bcdc31"} Apr 24 21:17:24.226503 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.226279 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:24.242558 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.242500 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4jr7n" podStartSLOduration=8.833690663 podStartE2EDuration="10.242480485s" podCreationTimestamp="2026-04-24 21:17:14 +0000 UTC" firstStartedPulling="2026-04-24 21:17:22.218429066 +0000 UTC m=+40.819584605" lastFinishedPulling="2026-04-24 21:17:23.627218875 +0000 UTC m=+42.228374427" observedRunningTime="2026-04-24 21:17:24.241655535 +0000 UTC m=+42.842811096" watchObservedRunningTime="2026-04-24 21:17:24.242480485 +0000 UTC m=+42.843636049" Apr 24 21:17:24.523860 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.523758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 
21:17:24.526484 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.526450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a-original-pull-secret\") pod \"global-pull-secret-syncer-fvzdq\" (UID: \"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a\") " pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:24.717459 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.717427 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvzdq" Apr 24 21:17:24.856940 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:24.856909 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fvzdq"] Apr 24 21:17:24.860630 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:24.860585 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod527f1b83_b8e1_4bd9_bb12_bbf7a6ab672a.slice/crio-e538d89e2a9bc4e22f36d9e9e59106d18598c30869dfd40886da81f1bbfec7aa WatchSource:0}: Error finding container e538d89e2a9bc4e22f36d9e9e59106d18598c30869dfd40886da81f1bbfec7aa: Status 404 returned error can't find the container with id e538d89e2a9bc4e22f36d9e9e59106d18598c30869dfd40886da81f1bbfec7aa Apr 24 21:17:25.229601 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:25.229563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z8ffq" event={"ID":"ccd8485f-5d27-4b5c-8155-58281c06943e","Type":"ContainerStarted","Data":"6eb79f535aa1779c21ce6ef7051af04a0f3235fe3e0a0bf0de84bba297503608"} Apr 24 21:17:25.230764 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:25.230739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fvzdq" 
event={"ID":"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a","Type":"ContainerStarted","Data":"e538d89e2a9bc4e22f36d9e9e59106d18598c30869dfd40886da81f1bbfec7aa"} Apr 24 21:17:25.232677 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:25.232652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2wft" event={"ID":"a78c9c94-bf81-4cf1-9862-dd3d48a12eba","Type":"ContainerStarted","Data":"21ca37f22f95220273af095f75ebbdd1e76606c95fb99340a939a205bbc59d12"} Apr 24 21:17:25.248372 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:25.248325 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z8ffq" podStartSLOduration=9.015189036 podStartE2EDuration="11.248311589s" podCreationTimestamp="2026-04-24 21:17:14 +0000 UTC" firstStartedPulling="2026-04-24 21:17:22.512792203 +0000 UTC m=+41.113947741" lastFinishedPulling="2026-04-24 21:17:24.745914742 +0000 UTC m=+43.347070294" observedRunningTime="2026-04-24 21:17:25.247732666 +0000 UTC m=+43.848888229" watchObservedRunningTime="2026-04-24 21:17:25.248311589 +0000 UTC m=+43.849467150" Apr 24 21:17:25.269472 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:25.269418 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l2wft" podStartSLOduration=8.825292869 podStartE2EDuration="11.269405051s" podCreationTimestamp="2026-04-24 21:17:14 +0000 UTC" firstStartedPulling="2026-04-24 21:17:22.301273892 +0000 UTC m=+40.902429430" lastFinishedPulling="2026-04-24 21:17:24.74538607 +0000 UTC m=+43.346541612" observedRunningTime="2026-04-24 21:17:25.268412049 +0000 UTC m=+43.869567591" watchObservedRunningTime="2026-04-24 21:17:25.269405051 +0000 UTC m=+43.870560611" Apr 24 21:17:30.250218 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:30.250178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fvzdq" 
event={"ID":"527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a","Type":"ContainerStarted","Data":"292f69d2b4e1fe88e01edb74dd82852773413bb88ddb1270af6fb290f6aba39d"} Apr 24 21:17:30.269186 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:30.269126 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fvzdq" podStartSLOduration=34.017186632 podStartE2EDuration="38.26910797s" podCreationTimestamp="2026-04-24 21:16:52 +0000 UTC" firstStartedPulling="2026-04-24 21:17:24.862376553 +0000 UTC m=+43.463532093" lastFinishedPulling="2026-04-24 21:17:29.114297889 +0000 UTC m=+47.715453431" observedRunningTime="2026-04-24 21:17:30.2684649 +0000 UTC m=+48.869620458" watchObservedRunningTime="2026-04-24 21:17:30.26910797 +0000 UTC m=+48.870263532" Apr 24 21:17:34.235411 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:34.235381 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4jr7n" Apr 24 21:17:37.634313 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.634282 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7649b9b64b-xms8x"] Apr 24 21:17:37.638292 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.638260 2575 patch_prober.go:28] interesting pod/image-registry-7649b9b64b-xms8x container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 21:17:37.638447 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.638310 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" podUID="66ff2b4e-2b04-4b4e-9761-f1430f4296de" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:17:37.670501 ip-10-0-141-46 
kubenswrapper[2575]: I0424 21:17:37.670464 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-jpt2j"] Apr 24 21:17:37.673203 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.673183 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jpt2j" Apr 24 21:17:37.678501 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.678475 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:17:37.679005 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.678989 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-t6gwz\"" Apr 24 21:17:37.679441 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.679329 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:17:37.705288 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.705252 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jpt2j"] Apr 24 21:17:37.716996 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.716971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbgcw\" (UniqueName: \"kubernetes.io/projected/3aed0d1c-5d90-4a85-912d-d13220c855e2-kube-api-access-tbgcw\") pod \"downloads-6bcc868b7-jpt2j\" (UID: \"3aed0d1c-5d90-4a85-912d-d13220c855e2\") " pod="openshift-console/downloads-6bcc868b7-jpt2j" Apr 24 21:17:37.768705 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.768668 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-74b659f8d6-f49ft"] Apr 24 21:17:37.770453 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.770438 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g"] Apr 24 21:17:37.770604 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.770585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.772497 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.772462 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" Apr 24 21:17:37.775311 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.775291 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-9h7km\"" Apr 24 21:17:37.775419 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.775396 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 21:17:37.819411 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.819282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-registry-tls\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.819411 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.819361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5gdj\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-kube-api-access-t5gdj\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.819630 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:17:37.819477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07b1e968-a375-49e9-81d4-c36509044321-ca-trust-extracted\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.819630 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.819573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-bound-sa-token\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.819630 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.819621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/07b1e968-a375-49e9-81d4-c36509044321-image-registry-private-configuration\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.819749 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.819655 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07b1e968-a375-49e9-81d4-c36509044321-registry-certificates\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.819800 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.819762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbgcw\" (UniqueName: 
\"kubernetes.io/projected/3aed0d1c-5d90-4a85-912d-d13220c855e2-kube-api-access-tbgcw\") pod \"downloads-6bcc868b7-jpt2j\" (UID: \"3aed0d1c-5d90-4a85-912d-d13220c855e2\") " pod="openshift-console/downloads-6bcc868b7-jpt2j" Apr 24 21:17:37.820128 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.820106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9sb4g\" (UID: \"15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" Apr 24 21:17:37.820202 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.820145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b1e968-a375-49e9-81d4-c36509044321-trusted-ca\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.820202 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.820182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07b1e968-a375-49e9-81d4-c36509044321-installation-pull-secrets\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.837725 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.837694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbgcw\" (UniqueName: \"kubernetes.io/projected/3aed0d1c-5d90-4a85-912d-d13220c855e2-kube-api-access-tbgcw\") pod \"downloads-6bcc868b7-jpt2j\" (UID: \"3aed0d1c-5d90-4a85-912d-d13220c855e2\") " 
pod="openshift-console/downloads-6bcc868b7-jpt2j" Apr 24 21:17:37.872763 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.872728 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74b659f8d6-f49ft"] Apr 24 21:17:37.920984 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.920900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9sb4g\" (UID: \"15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" Apr 24 21:17:37.920984 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.920936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b1e968-a375-49e9-81d4-c36509044321-trusted-ca\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921455 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07b1e968-a375-49e9-81d4-c36509044321-installation-pull-secrets\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921455 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921146 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-registry-tls\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " 
pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921455 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5gdj\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-kube-api-access-t5gdj\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921455 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07b1e968-a375-49e9-81d4-c36509044321-ca-trust-extracted\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921455 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-bound-sa-token\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921455 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/07b1e968-a375-49e9-81d4-c36509044321-image-registry-private-configuration\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921455 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07b1e968-a375-49e9-81d4-c36509044321-registry-certificates\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.921897 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.921844 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g"] Apr 24 21:17:37.922028 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.922002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07b1e968-a375-49e9-81d4-c36509044321-trusted-ca\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.922718 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.922227 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/07b1e968-a375-49e9-81d4-c36509044321-registry-certificates\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.922718 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.922510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/07b1e968-a375-49e9-81d4-c36509044321-ca-trust-extracted\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.924191 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.924146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9sb4g\" (UID: \"15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" Apr 24 21:17:37.924502 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.924478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/07b1e968-a375-49e9-81d4-c36509044321-installation-pull-secrets\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.924502 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.924494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/07b1e968-a375-49e9-81d4-c36509044321-image-registry-private-configuration\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.924804 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.924786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-registry-tls\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.981607 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.981581 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-jpt2j" Apr 24 21:17:37.981953 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.981931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5gdj\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-kube-api-access-t5gdj\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:37.992126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:37.992102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07b1e968-a375-49e9-81d4-c36509044321-bound-sa-token\") pod \"image-registry-74b659f8d6-f49ft\" (UID: \"07b1e968-a375-49e9-81d4-c36509044321\") " pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:38.080370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.080332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:38.085144 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.085120 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" Apr 24 21:17:38.140357 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.139982 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-jpt2j"] Apr 24 21:17:38.150289 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:38.150191 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aed0d1c_5d90_4a85_912d_d13220c855e2.slice/crio-40627a48e0f2600ef2048aa6358a10b603e5127510b633ff2a44a07aa5d7e240 WatchSource:0}: Error finding container 40627a48e0f2600ef2048aa6358a10b603e5127510b633ff2a44a07aa5d7e240: Status 404 returned error can't find the container with id 40627a48e0f2600ef2048aa6358a10b603e5127510b633ff2a44a07aa5d7e240 Apr 24 21:17:38.250040 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.250010 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74b659f8d6-f49ft"] Apr 24 21:17:38.250655 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.250628 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g"] Apr 24 21:17:38.251158 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:38.251130 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15cf2e85_2ffe_4348_8ba9_0a843b7ba7e6.slice/crio-d28302720f2d7bf45f36c18d11d13c56926217424e7227c7e940ed8aba558db6 WatchSource:0}: Error finding container d28302720f2d7bf45f36c18d11d13c56926217424e7227c7e940ed8aba558db6: Status 404 returned error can't find the container with id d28302720f2d7bf45f36c18d11d13c56926217424e7227c7e940ed8aba558db6 Apr 24 21:17:38.251728 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:38.251701 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b1e968_a375_49e9_81d4_c36509044321.slice/crio-0ab209076aae20a6a119f79db9b488b1d9d921d41d1e50aa3c4ee1292fd74500 WatchSource:0}: Error finding container 0ab209076aae20a6a119f79db9b488b1d9d921d41d1e50aa3c4ee1292fd74500: Status 404 returned error can't find the container with id 0ab209076aae20a6a119f79db9b488b1d9d921d41d1e50aa3c4ee1292fd74500 Apr 24 21:17:38.270221 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.270190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" event={"ID":"07b1e968-a375-49e9-81d4-c36509044321","Type":"ContainerStarted","Data":"0ab209076aae20a6a119f79db9b488b1d9d921d41d1e50aa3c4ee1292fd74500"} Apr 24 21:17:38.271284 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.271253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" event={"ID":"15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6","Type":"ContainerStarted","Data":"d28302720f2d7bf45f36c18d11d13c56926217424e7227c7e940ed8aba558db6"} Apr 24 21:17:38.272437 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:38.272411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jpt2j" event={"ID":"3aed0d1c-5d90-4a85-912d-d13220c855e2","Type":"ContainerStarted","Data":"40627a48e0f2600ef2048aa6358a10b603e5127510b633ff2a44a07aa5d7e240"} Apr 24 21:17:39.277204 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:39.277167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" event={"ID":"07b1e968-a375-49e9-81d4-c36509044321","Type":"ContainerStarted","Data":"33a2882e1927c796b65a96d6e2508312febc909e5c54f8bfdd6c94f46c7de0c4"} Apr 24 21:17:39.277572 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:39.277291 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:17:39.334440 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:39.334395 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" podStartSLOduration=2.334380483 podStartE2EDuration="2.334380483s" podCreationTimestamp="2026-04-24 21:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:39.332159904 +0000 UTC m=+57.933315465" watchObservedRunningTime="2026-04-24 21:17:39.334380483 +0000 UTC m=+57.935536022" Apr 24 21:17:40.181439 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:40.181411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9z5w" Apr 24 21:17:40.281652 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:40.281596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" event={"ID":"15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6","Type":"ContainerStarted","Data":"63e149e7d4d92b4e5585ff93d90b2854f36a6169689bf560a78f3ca9802f9c00"} Apr 24 21:17:40.282144 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:40.281953 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" Apr 24 21:17:40.288084 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:40.288036 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" Apr 24 21:17:40.331126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:40.330233 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9sb4g" podStartSLOduration=2.308232775 
podStartE2EDuration="3.330214897s" podCreationTimestamp="2026-04-24 21:17:37 +0000 UTC" firstStartedPulling="2026-04-24 21:17:38.25318984 +0000 UTC m=+56.854345379" lastFinishedPulling="2026-04-24 21:17:39.275171947 +0000 UTC m=+57.876327501" observedRunningTime="2026-04-24 21:17:40.329815837 +0000 UTC m=+58.930971389" watchObservedRunningTime="2026-04-24 21:17:40.330214897 +0000 UTC m=+58.931370459" Apr 24 21:17:41.199365 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.199326 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7j6fz"] Apr 24 21:17:41.201625 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.201605 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.207360 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.207332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-tzb4d\"" Apr 24 21:17:41.207360 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.207333 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:17:41.207573 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.207342 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:17:41.207803 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.207774 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:17:41.207926 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.207841 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:17:41.207926 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:17:41.207860 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:17:41.229289 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.229250 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7j6fz"] Apr 24 21:17:41.249429 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.249392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.249608 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.249450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.249608 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.249555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwcxt\" (UniqueName: \"kubernetes.io/projected/871d248c-2080-42f6-b548-3ed0e9f28ff0-kube-api-access-fwcxt\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.249725 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.249612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/871d248c-2080-42f6-b548-3ed0e9f28ff0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.350534 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.350490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.351009 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.350544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwcxt\" (UniqueName: \"kubernetes.io/projected/871d248c-2080-42f6-b548-3ed0e9f28ff0-kube-api-access-fwcxt\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.351009 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.350585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/871d248c-2080-42f6-b548-3ed0e9f28ff0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.351009 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.350672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: 
\"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.351009 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:41.350777 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 21:17:41.351009 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:41.350849 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-tls podName:871d248c-2080-42f6-b548-3ed0e9f28ff0 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:41.850827691 +0000 UTC m=+60.451983234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-7j6fz" (UID: "871d248c-2080-42f6-b548-3ed0e9f28ff0") : secret "prometheus-operator-tls" not found Apr 24 21:17:41.351266 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.351251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/871d248c-2080-42f6-b548-3ed0e9f28ff0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.352874 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.352851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.385692 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:17:41.385652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwcxt\" (UniqueName: \"kubernetes.io/projected/871d248c-2080-42f6-b548-3ed0e9f28ff0-kube-api-access-fwcxt\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.856059 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.856026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:41.858682 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:41.858653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/871d248c-2080-42f6-b548-3ed0e9f28ff0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-7j6fz\" (UID: \"871d248c-2080-42f6-b548-3ed0e9f28ff0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:42.117546 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:42.117472 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-tzb4d\"" Apr 24 21:17:42.123698 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:42.123665 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" Apr 24 21:17:42.266932 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:42.266897 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-7j6fz"] Apr 24 21:17:42.271085 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:42.271048 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod871d248c_2080_42f6_b548_3ed0e9f28ff0.slice/crio-fd8687ce92e76087076d774b8004d7273346a8c9c5f622b9bc69a36ff4dca418 WatchSource:0}: Error finding container fd8687ce92e76087076d774b8004d7273346a8c9c5f622b9bc69a36ff4dca418: Status 404 returned error can't find the container with id fd8687ce92e76087076d774b8004d7273346a8c9c5f622b9bc69a36ff4dca418 Apr 24 21:17:42.288598 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:42.288566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" event={"ID":"871d248c-2080-42f6-b548-3ed0e9f28ff0","Type":"ContainerStarted","Data":"fd8687ce92e76087076d774b8004d7273346a8c9c5f622b9bc69a36ff4dca418"} Apr 24 21:17:43.285847 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.285027 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9b6d99dc7-2lxqr"] Apr 24 21:17:43.287297 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.287267 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.290768 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.290736 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:17:43.291039 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.291012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:17:43.291802 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.291781 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sxvx6\"" Apr 24 21:17:43.291901 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.291872 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:17:43.292275 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.292105 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:17:43.292275 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.292207 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:17:43.329744 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.329711 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9b6d99dc7-2lxqr"] Apr 24 21:17:43.367961 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.367915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-oauth-serving-cert\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.368149 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:17:43.368111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-oauth-config\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.368149 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.368144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-serving-cert\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.368264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.368169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh859\" (UniqueName: \"kubernetes.io/projected/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-kube-api-access-jh859\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.368264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.368194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-service-ca\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.368264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.368247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-config\") pod \"console-9b6d99dc7-2lxqr\" (UID: 
\"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.469073 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.469023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-oauth-config\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.469171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.469084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-serving-cert\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.469171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.469109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh859\" (UniqueName: \"kubernetes.io/projected/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-kube-api-access-jh859\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.469171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.469133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-service-ca\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.469171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.469165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-config\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.469368 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.469220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-oauth-serving-cert\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.470004 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.469969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-oauth-serving-cert\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.470126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.470014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-service-ca\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.470126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.470050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-config\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.471936 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.471911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-oauth-config\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.472650 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.472626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-serving-cert\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.534250 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.534216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh859\" (UniqueName: \"kubernetes.io/projected/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-kube-api-access-jh859\") pod \"console-9b6d99dc7-2lxqr\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") " pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.599748 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.599677 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:43.773219 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:43.773178 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9b6d99dc7-2lxqr"] Apr 24 21:17:43.777983 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:43.777943 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c7be9e_6e02_4d37_9210_a1b659d4bea5.slice/crio-e7e571653bf1ab1f49809ea3960ece0237100ad37e61b954858dcffb48f15899 WatchSource:0}: Error finding container e7e571653bf1ab1f49809ea3960ece0237100ad37e61b954858dcffb48f15899: Status 404 returned error can't find the container with id e7e571653bf1ab1f49809ea3960ece0237100ad37e61b954858dcffb48f15899 Apr 24 21:17:44.298078 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:44.298018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" event={"ID":"871d248c-2080-42f6-b548-3ed0e9f28ff0","Type":"ContainerStarted","Data":"6ecb2ec01efbcbb11bb4b09761c14d6add420b2bac671c222651a4f6be110a5c"} Apr 24 21:17:44.298673 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:44.298082 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" event={"ID":"871d248c-2080-42f6-b548-3ed0e9f28ff0","Type":"ContainerStarted","Data":"b6426041d447c4e97910520318f24d15190ea40970b7010789c14925e878b16b"} Apr 24 21:17:44.299566 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:44.299532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9b6d99dc7-2lxqr" event={"ID":"c1c7be9e-6e02-4d37-9210-a1b659d4bea5","Type":"ContainerStarted","Data":"e7e571653bf1ab1f49809ea3960ece0237100ad37e61b954858dcffb48f15899"} Apr 24 21:17:44.319934 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:44.319862 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-7j6fz" podStartSLOduration=2.13065954 podStartE2EDuration="3.319841164s" podCreationTimestamp="2026-04-24 21:17:41 +0000 UTC" firstStartedPulling="2026-04-24 21:17:42.273475379 +0000 UTC m=+60.874630924" lastFinishedPulling="2026-04-24 21:17:43.462656997 +0000 UTC m=+62.063812548" observedRunningTime="2026-04-24 21:17:44.319482277 +0000 UTC m=+62.920637838" watchObservedRunningTime="2026-04-24 21:17:44.319841164 +0000 UTC m=+62.920996726" Apr 24 21:17:46.879175 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:46.879142 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4f4tq"] Apr 24 21:17:46.881370 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:46.881347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:46.883907 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:46.883871 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:17:46.884045 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:46.883871 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fm79n\"" Apr 24 21:17:46.884282 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:46.884263 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:17:46.884507 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:46.884488 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:17:47.001866 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.001787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-wtmp\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.001866 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.001831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-textfile\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.001866 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.001858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-root\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.002142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.001953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-tls\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.002142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.002006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.002142 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:17:47.002079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.002142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.002114 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxj44\" (UniqueName: \"kubernetes.io/projected/721680e7-1c21-4de2-ad20-76f3ca88fdf4-kube-api-access-lxj44\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.002142 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.002136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/721680e7-1c21-4de2-ad20-76f3ca88fdf4-metrics-client-ca\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.002396 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.002217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-sys\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103358 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-textfile\") pod \"node-exporter-4f4tq\" (UID: 
\"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103549 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-root\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103549 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103418 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-tls\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103549 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103549 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-root\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103549 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103802 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxj44\" (UniqueName: \"kubernetes.io/projected/721680e7-1c21-4de2-ad20-76f3ca88fdf4-kube-api-access-lxj44\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103802 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:47.103623 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:17:47.103802 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/721680e7-1c21-4de2-ad20-76f3ca88fdf4-metrics-client-ca\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103802 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:17:47.103697 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-tls podName:721680e7-1c21-4de2-ad20-76f3ca88fdf4 nodeName:}" failed. No retries permitted until 2026-04-24 21:17:47.603680002 +0000 UTC m=+66.204835546 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-tls") pod "node-exporter-4f4tq" (UID: "721680e7-1c21-4de2-ad20-76f3ca88fdf4") : secret "node-exporter-tls" not found Apr 24 21:17:47.103802 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-sys\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103802 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-textfile\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.103802 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-wtmp\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.104152 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-wtmp\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.104152 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.103890 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/721680e7-1c21-4de2-ad20-76f3ca88fdf4-sys\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.104337 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.104314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/721680e7-1c21-4de2-ad20-76f3ca88fdf4-metrics-client-ca\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.104420 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.104314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-accelerators-collector-config\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.106463 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.106436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.116623 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.116587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxj44\" (UniqueName: \"kubernetes.io/projected/721680e7-1c21-4de2-ad20-76f3ca88fdf4-kube-api-access-lxj44\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.308723 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:17:47.308682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9b6d99dc7-2lxqr" event={"ID":"c1c7be9e-6e02-4d37-9210-a1b659d4bea5","Type":"ContainerStarted","Data":"11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43"} Apr 24 21:17:47.342367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.342307 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9b6d99dc7-2lxqr" podStartSLOduration=1.447245198 podStartE2EDuration="4.342286776s" podCreationTimestamp="2026-04-24 21:17:43 +0000 UTC" firstStartedPulling="2026-04-24 21:17:43.780198736 +0000 UTC m=+62.381354292" lastFinishedPulling="2026-04-24 21:17:46.675240316 +0000 UTC m=+65.276395870" observedRunningTime="2026-04-24 21:17:47.341508247 +0000 UTC m=+65.942663811" watchObservedRunningTime="2026-04-24 21:17:47.342286776 +0000 UTC m=+65.943442338" Apr 24 21:17:47.609253 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.609162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-tls\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.611757 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.611723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/721680e7-1c21-4de2-ad20-76f3ca88fdf4-node-exporter-tls\") pod \"node-exporter-4f4tq\" (UID: \"721680e7-1c21-4de2-ad20-76f3ca88fdf4\") " pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.640999 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.640966 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:17:47.710014 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:17:47.709976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:47.712482 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.712457 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:17:47.722979 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.722914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3b8d86d-c179-4368-b025-0c7f41b2aa3e-metrics-certs\") pod \"network-metrics-daemon-qm8hh\" (UID: \"f3b8d86d-c179-4368-b025-0c7f41b2aa3e\") " pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:47.792329 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.792294 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4f4tq" Apr 24 21:17:47.810877 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.810832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:47.813005 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.812977 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:17:47.813605 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.813580 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gjqgx\"" Apr 24 21:17:47.821699 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.821675 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qm8hh" Apr 24 21:17:47.824109 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.823874 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:17:47.834760 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:47.834734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2sr2\" (UniqueName: \"kubernetes.io/projected/e42a906f-a474-4d11-8a0d-e8ef290b2e14-kube-api-access-b2sr2\") pod \"network-check-target-qfs2f\" (UID: \"e42a906f-a474-4d11-8a0d-e8ef290b2e14\") " pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:48.124772 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:48.124737 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6pwcs\"" Apr 24 21:17:48.133037 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:48.133001 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:53.601117 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:53.600856 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:53.601117 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:53.601028 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:53.607107 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:53.607083 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:54.223042 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:54.222994 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod721680e7_1c21_4de2_ad20_76f3ca88fdf4.slice/crio-8b215dc1ab3b867b941a56f2937e09711f731435059e18c6dcd75110bfe4676e WatchSource:0}: Error finding container 8b215dc1ab3b867b941a56f2937e09711f731435059e18c6dcd75110bfe4676e: Status 404 returned error can't find the container with id 8b215dc1ab3b867b941a56f2937e09711f731435059e18c6dcd75110bfe4676e Apr 24 21:17:54.331228 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:54.331176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4f4tq" event={"ID":"721680e7-1c21-4de2-ad20-76f3ca88fdf4","Type":"ContainerStarted","Data":"8b215dc1ab3b867b941a56f2937e09711f731435059e18c6dcd75110bfe4676e"} Apr 24 21:17:54.337302 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:54.337270 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9b6d99dc7-2lxqr" Apr 24 21:17:54.344900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:54.344773 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qfs2f"] 
Apr 24 21:17:54.347009 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:54.346978 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode42a906f_a474_4d11_8a0d_e8ef290b2e14.slice/crio-5d876c0b053efa254932e58671c352c4e5f80a79adbfda74efdf78f1f3ce44b7 WatchSource:0}: Error finding container 5d876c0b053efa254932e58671c352c4e5f80a79adbfda74efdf78f1f3ce44b7: Status 404 returned error can't find the container with id 5d876c0b053efa254932e58671c352c4e5f80a79adbfda74efdf78f1f3ce44b7 Apr 24 21:17:54.365830 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:54.365799 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qm8hh"] Apr 24 21:17:54.369264 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:54.369231 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3b8d86d_c179_4368_b025_0c7f41b2aa3e.slice/crio-073cb6538b366d8a9fd632c80dbf2317230576a8c75580048e30f3165587a061 WatchSource:0}: Error finding container 073cb6538b366d8a9fd632c80dbf2317230576a8c75580048e30f3165587a061: Status 404 returned error can't find the container with id 073cb6538b366d8a9fd632c80dbf2317230576a8c75580048e30f3165587a061 Apr 24 21:17:55.336463 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:55.336374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qm8hh" event={"ID":"f3b8d86d-c179-4368-b025-0c7f41b2aa3e","Type":"ContainerStarted","Data":"073cb6538b366d8a9fd632c80dbf2317230576a8c75580048e30f3165587a061"} Apr 24 21:17:55.338306 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:55.338262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qfs2f" event={"ID":"e42a906f-a474-4d11-8a0d-e8ef290b2e14","Type":"ContainerStarted","Data":"5d876c0b053efa254932e58671c352c4e5f80a79adbfda74efdf78f1f3ce44b7"} Apr 
24 21:17:55.340951 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:55.340905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-jpt2j" event={"ID":"3aed0d1c-5d90-4a85-912d-d13220c855e2","Type":"ContainerStarted","Data":"7e9583fbea45f92a1003b5012fbaa3867edc6867ca93e8aa1deb8d64298f905a"} Apr 24 21:17:55.341342 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:55.341240 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-jpt2j" Apr 24 21:17:55.356798 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:55.356721 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-jpt2j" Apr 24 21:17:55.362625 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:55.362559 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-jpt2j" podStartSLOduration=2.111890614 podStartE2EDuration="18.362538043s" podCreationTimestamp="2026-04-24 21:17:37 +0000 UTC" firstStartedPulling="2026-04-24 21:17:38.152679092 +0000 UTC m=+56.753834636" lastFinishedPulling="2026-04-24 21:17:54.403326523 +0000 UTC m=+73.004482065" observedRunningTime="2026-04-24 21:17:55.358856013 +0000 UTC m=+73.960011573" watchObservedRunningTime="2026-04-24 21:17:55.362538043 +0000 UTC m=+73.963693604" Apr 24 21:17:56.213031 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.211804 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bc6f8dcf7-zx6db"] Apr 24 21:17:56.237141 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.236644 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bc6f8dcf7-zx6db"] Apr 24 21:17:56.237141 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.236791 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.245931 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.245752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:17:56.290399 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.290302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-trusted-ca-bundle\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.290399 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.290360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqcjr\" (UniqueName: \"kubernetes.io/projected/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-kube-api-access-tqcjr\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.290669 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.290468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-service-ca\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.290669 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.290542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-config\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" 
Apr 24 21:17:56.290669 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.290569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-oauth-config\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.290669 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.290599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-serving-cert\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.290669 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.290623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-oauth-serving-cert\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.347722 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.347585 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qm8hh" event={"ID":"f3b8d86d-c179-4368-b025-0c7f41b2aa3e","Type":"ContainerStarted","Data":"7c6d03a965beb2278de26f1ff503caf25907158a59ea3dbbdd168da6a6a32eea"} Apr 24 21:17:56.347722 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.347632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qm8hh" event={"ID":"f3b8d86d-c179-4368-b025-0c7f41b2aa3e","Type":"ContainerStarted","Data":"eb3beb00c8126a2bae10f939e0146dd31bc5f6726347b2a957dea78ae0541c54"} Apr 24 
21:17:56.350754 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.350728 2575 generic.go:358] "Generic (PLEG): container finished" podID="721680e7-1c21-4de2-ad20-76f3ca88fdf4" containerID="f90157c6ebdf3303cf798ee3df4bc4f5a9a9fd589fcc635da7b65d38f3f22b65" exitCode=0 Apr 24 21:17:56.350888 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.350824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4f4tq" event={"ID":"721680e7-1c21-4de2-ad20-76f3ca88fdf4","Type":"ContainerDied","Data":"f90157c6ebdf3303cf798ee3df4bc4f5a9a9fd589fcc635da7b65d38f3f22b65"} Apr 24 21:17:56.374593 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.374015 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qm8hh" podStartSLOduration=72.988656351 podStartE2EDuration="1m14.373995329s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:17:54.37158629 +0000 UTC m=+72.972741844" lastFinishedPulling="2026-04-24 21:17:55.756925283 +0000 UTC m=+74.358080822" observedRunningTime="2026-04-24 21:17:56.372852862 +0000 UTC m=+74.974008443" watchObservedRunningTime="2026-04-24 21:17:56.373995329 +0000 UTC m=+74.975150891" Apr 24 21:17:56.391815 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.391681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-service-ca\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.391815 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.391768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-config\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") 
" pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.392139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-oauth-config\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.392185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-serving-cert\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.392209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-oauth-serving-cert\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.392263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-trusted-ca-bundle\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.392293 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqcjr\" (UniqueName: \"kubernetes.io/projected/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-kube-api-access-tqcjr\") 
pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.392491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-service-ca\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393091 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.392526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-config\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.393696 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.393652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-oauth-serving-cert\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.394431 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.394410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-trusted-ca-bundle\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.404691 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.404619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-oauth-config\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.405420 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.405296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-serving-cert\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.409236 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.406663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqcjr\" (UniqueName: \"kubernetes.io/projected/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-kube-api-access-tqcjr\") pod \"console-6bc6f8dcf7-zx6db\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") " pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.551296 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.551262 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bc6f8dcf7-zx6db" Apr 24 21:17:56.728944 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:56.728883 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bc6f8dcf7-zx6db"] Apr 24 21:17:56.733042 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:17:56.733008 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c6d61c4_fcb9_4d19_884e_e3fba6b76a7b.slice/crio-b71639cc64e62e2de7c07a2dd67ccd672f253e077dfb3e40b2b0d14088735426 WatchSource:0}: Error finding container b71639cc64e62e2de7c07a2dd67ccd672f253e077dfb3e40b2b0d14088735426: Status 404 returned error can't find the container with id b71639cc64e62e2de7c07a2dd67ccd672f253e077dfb3e40b2b0d14088735426 Apr 24 21:17:57.360346 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:57.360300 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4f4tq" event={"ID":"721680e7-1c21-4de2-ad20-76f3ca88fdf4","Type":"ContainerStarted","Data":"d8ef2fddf94ca9ac27f32b276692fd93fad829607214814150026b01e33b468b"} Apr 24 21:17:57.360346 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:57.360345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4f4tq" event={"ID":"721680e7-1c21-4de2-ad20-76f3ca88fdf4","Type":"ContainerStarted","Data":"e291c65f0c72961d20a6f23cf645f515a3addf291879c9d644a6247d96013c0c"} Apr 24 21:17:57.366156 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:57.366117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bc6f8dcf7-zx6db" event={"ID":"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b","Type":"ContainerStarted","Data":"a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4"} Apr 24 21:17:57.366344 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:57.366179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-6bc6f8dcf7-zx6db" event={"ID":"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b","Type":"ContainerStarted","Data":"b71639cc64e62e2de7c07a2dd67ccd672f253e077dfb3e40b2b0d14088735426"} Apr 24 21:17:57.386592 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:57.386531 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4f4tq" podStartSLOduration=9.857604258 podStartE2EDuration="11.386505506s" podCreationTimestamp="2026-04-24 21:17:46 +0000 UTC" firstStartedPulling="2026-04-24 21:17:54.224725769 +0000 UTC m=+72.825881310" lastFinishedPulling="2026-04-24 21:17:55.753627005 +0000 UTC m=+74.354782558" observedRunningTime="2026-04-24 21:17:57.384871321 +0000 UTC m=+75.986026883" watchObservedRunningTime="2026-04-24 21:17:57.386505506 +0000 UTC m=+75.987661078" Apr 24 21:17:57.406860 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:57.406806 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bc6f8dcf7-zx6db" podStartSLOduration=1.406782053 podStartE2EDuration="1.406782053s" podCreationTimestamp="2026-04-24 21:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:17:57.403994298 +0000 UTC m=+76.005149859" watchObservedRunningTime="2026-04-24 21:17:57.406782053 +0000 UTC m=+76.007937613" Apr 24 21:17:59.372520 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:59.372476 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qfs2f" event={"ID":"e42a906f-a474-4d11-8a0d-e8ef290b2e14","Type":"ContainerStarted","Data":"a3e6ecc354bed5e46b59af3e5cf33f9fd6dd4d391e9eff12e5a829ce42b333df"} Apr 24 21:17:59.372975 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:59.372611 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-qfs2f" Apr 24 21:17:59.390416 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:17:59.389815 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qfs2f" podStartSLOduration=72.990470571 podStartE2EDuration="1m17.389783499s" podCreationTimestamp="2026-04-24 21:16:42 +0000 UTC" firstStartedPulling="2026-04-24 21:17:54.349727264 +0000 UTC m=+72.950882803" lastFinishedPulling="2026-04-24 21:17:58.749040174 +0000 UTC m=+77.350195731" observedRunningTime="2026-04-24 21:17:59.388942215 +0000 UTC m=+77.990097799" watchObservedRunningTime="2026-04-24 21:17:59.389783499 +0000 UTC m=+77.990939060" Apr 24 21:18:00.287012 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.286981 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-74b659f8d6-f49ft" Apr 24 21:18:00.381580 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.381546 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bc6f8dcf7-zx6db"] Apr 24 21:18:00.411080 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.411025 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-556cbcdb8b-ksns7"] Apr 24 21:18:00.434212 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.434182 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556cbcdb8b-ksns7"] Apr 24 21:18:00.434381 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.434313 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.531039 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.531005 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-console-config\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.531236 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.531097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-trusted-ca-bundle\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.531236 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.531124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwlv\" (UniqueName: \"kubernetes.io/projected/38204513-e6bf-4005-bbba-ef622f4d314e-kube-api-access-fcwlv\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.531236 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.531152 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-oauth-serving-cert\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.531236 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.531185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-oauth-config\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.531433 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.531258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-serving-cert\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.531433 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.531312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-service-ca\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.632178 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-service-ca\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.632381 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-console-config\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.632381 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632273 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-trusted-ca-bundle\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.632381 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwlv\" (UniqueName: \"kubernetes.io/projected/38204513-e6bf-4005-bbba-ef622f4d314e-kube-api-access-fcwlv\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.632381 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632312 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-oauth-serving-cert\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.632381 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-oauth-config\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.632381 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632381 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-serving-cert\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 
24 21:18:00.633043 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.632991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-service-ca\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.633043 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.633004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-console-config\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.633243 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.633092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-oauth-serving-cert\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.633437 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.633410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-trusted-ca-bundle\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.635259 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.635241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-serving-cert\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " 
pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.635354 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.635241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-oauth-config\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.640825 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.640794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwlv\" (UniqueName: \"kubernetes.io/projected/38204513-e6bf-4005-bbba-ef622f4d314e-kube-api-access-fcwlv\") pod \"console-556cbcdb8b-ksns7\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.746271 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.746230 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:18:00.888598 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:00.888542 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556cbcdb8b-ksns7"] Apr 24 21:18:00.898161 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:18:00.898123 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38204513_e6bf_4005_bbba_ef622f4d314e.slice/crio-e9105ee4dec185bfabef94f44641d39cd7c61db65c23e1e25a0f205f22af0f68 WatchSource:0}: Error finding container e9105ee4dec185bfabef94f44641d39cd7c61db65c23e1e25a0f205f22af0f68: Status 404 returned error can't find the container with id e9105ee4dec185bfabef94f44641d39cd7c61db65c23e1e25a0f205f22af0f68 Apr 24 21:18:01.382243 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:01.382201 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556cbcdb8b-ksns7" event={"ID":"38204513-e6bf-4005-bbba-ef622f4d314e","Type":"ContainerStarted","Data":"f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04"} Apr 24 21:18:01.382243 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:01.382246 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556cbcdb8b-ksns7" event={"ID":"38204513-e6bf-4005-bbba-ef622f4d314e","Type":"ContainerStarted","Data":"e9105ee4dec185bfabef94f44641d39cd7c61db65c23e1e25a0f205f22af0f68"} Apr 24 21:18:01.402458 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:01.402408 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-556cbcdb8b-ksns7" podStartSLOduration=1.402388792 podStartE2EDuration="1.402388792s" podCreationTimestamp="2026-04-24 21:18:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:18:01.400990378 +0000 UTC m=+80.002145980" 
watchObservedRunningTime="2026-04-24 21:18:01.402388792 +0000 UTC m=+80.003544353" Apr 24 21:18:02.652720 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:02.652681 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" podUID="66ff2b4e-2b04-4b4e-9761-f1430f4296de" containerName="registry" containerID="cri-o://74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6" gracePeriod=30 Apr 24 21:18:02.936096 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:02.936047 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" Apr 24 21:18:03.055956 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.055921 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-trusted-ca\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.055956 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.055964 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.056246 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.055994 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66ff2b4e-2b04-4b4e-9761-f1430f4296de-ca-trust-extracted\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.056246 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.056020 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zjhx9\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-kube-api-access-zjhx9\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.056246 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.056059 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-image-registry-private-configuration\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.056246 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.056103 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-bound-sa-token\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.056246 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.056148 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-installation-pull-secrets\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.056246 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.056169 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-certificates\") pod \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\" (UID: \"66ff2b4e-2b04-4b4e-9761-f1430f4296de\") " Apr 24 21:18:03.056565 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.056380 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:03.056897 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.056673 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:18:03.058763 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.058735 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:03.058991 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.058953 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:03.059082 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.058993 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:03.059149 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.059078 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:18:03.059584 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.059559 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-kube-api-access-zjhx9" (OuterVolumeSpecName: "kube-api-access-zjhx9") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "kube-api-access-zjhx9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:18:03.067693 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.067663 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66ff2b4e-2b04-4b4e-9761-f1430f4296de-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "66ff2b4e-2b04-4b4e-9761-f1430f4296de" (UID: "66ff2b4e-2b04-4b4e-9761-f1430f4296de"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:18:03.157126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157084 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-trusted-ca\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.157126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157118 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-tls\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.157126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157129 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/66ff2b4e-2b04-4b4e-9761-f1430f4296de-ca-trust-extracted\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.157426 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157143 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjhx9\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-kube-api-access-zjhx9\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.157426 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157156 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-image-registry-private-configuration\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.157426 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157166 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/66ff2b4e-2b04-4b4e-9761-f1430f4296de-bound-sa-token\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.157426 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157174 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/66ff2b4e-2b04-4b4e-9761-f1430f4296de-installation-pull-secrets\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.157426 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.157186 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/66ff2b4e-2b04-4b4e-9761-f1430f4296de-registry-certificates\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:18:03.391701 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.391662 2575 generic.go:358] "Generic (PLEG): container finished" podID="66ff2b4e-2b04-4b4e-9761-f1430f4296de" containerID="74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6" exitCode=0 Apr 24 21:18:03.391890 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.391734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" event={"ID":"66ff2b4e-2b04-4b4e-9761-f1430f4296de","Type":"ContainerDied","Data":"74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6"} Apr 24 21:18:03.391890 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.391770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x" event={"ID":"66ff2b4e-2b04-4b4e-9761-f1430f4296de","Type":"ContainerDied","Data":"5b6f7cfbeb46184ece9082dbc015ada1598344c8e9e5c6d62d98960b7c8314b3"} Apr 24 21:18:03.391890 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.391742 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7649b9b64b-xms8x"
Apr 24 21:18:03.391890 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.391791 2575 scope.go:117] "RemoveContainer" containerID="74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6"
Apr 24 21:18:03.402563 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.402538 2575 scope.go:117] "RemoveContainer" containerID="74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6"
Apr 24 21:18:03.402895 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:18:03.402870 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6\": container with ID starting with 74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6 not found: ID does not exist" containerID="74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6"
Apr 24 21:18:03.402980 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.402903 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6"} err="failed to get container status \"74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6\": rpc error: code = NotFound desc = could not find container \"74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6\": container with ID starting with 74e8c25404e3e6cf1517c3547f31319e1cd67d943f96752d615b64845655ddf6 not found: ID does not exist"
Apr 24 21:18:03.422938 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.422906 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7649b9b64b-xms8x"]
Apr 24 21:18:03.433903 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:03.433871 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7649b9b64b-xms8x"]
Apr 24 21:18:04.006282 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:04.006242 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ff2b4e-2b04-4b4e-9761-f1430f4296de" path="/var/lib/kubelet/pods/66ff2b4e-2b04-4b4e-9761-f1430f4296de/volumes"
Apr 24 21:18:06.552465 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:06.552428 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bc6f8dcf7-zx6db"
Apr 24 21:18:10.746583 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:10.746542 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-556cbcdb8b-ksns7"
Apr 24 21:18:10.746583 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:10.746591 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-556cbcdb8b-ksns7"
Apr 24 21:18:10.751621 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:10.751599 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-556cbcdb8b-ksns7"
Apr 24 21:18:11.420506 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:11.420479 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-556cbcdb8b-ksns7"
Apr 24 21:18:11.471043 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:11.471009 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9b6d99dc7-2lxqr"]
Apr 24 21:18:25.403489 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.403432 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bc6f8dcf7-zx6db" podUID="8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" containerName="console" containerID="cri-o://a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4" gracePeriod=15
Apr 24 21:18:25.679432 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.679406 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bc6f8dcf7-zx6db_8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b/console/0.log"
Apr 24 21:18:25.679541 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.679472 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bc6f8dcf7-zx6db"
Apr 24 21:18:25.725922 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.725884 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-trusted-ca-bundle\") pod \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") "
Apr 24 21:18:25.725922 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.725924 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-serving-cert\") pod \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") "
Apr 24 21:18:25.726147 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.725959 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-service-ca\") pod \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") "
Apr 24 21:18:25.726147 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726127 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqcjr\" (UniqueName: \"kubernetes.io/projected/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-kube-api-access-tqcjr\") pod \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") "
Apr 24 21:18:25.726233 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726179 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-oauth-config\") pod \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") "
Apr 24 21:18:25.726286 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726246 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-config\") pod \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") "
Apr 24 21:18:25.726286 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726276 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-oauth-serving-cert\") pod \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\" (UID: \"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b\") "
Apr 24 21:18:25.726380 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726294 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" (UID: "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:25.726431 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726382 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-service-ca" (OuterVolumeSpecName: "service-ca") pod "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" (UID: "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:25.726537 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726518 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-trusted-ca-bundle\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:25.726590 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726549 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-service-ca\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:25.726729 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726706 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-config" (OuterVolumeSpecName: "console-config") pod "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" (UID: "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:25.726729 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.726720 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" (UID: "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:25.728359 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.728323 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" (UID: "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:25.728359 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.728336 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-kube-api-access-tqcjr" (OuterVolumeSpecName: "kube-api-access-tqcjr") pod "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" (UID: "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b"). InnerVolumeSpecName "kube-api-access-tqcjr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:18:25.728513 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.728404 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" (UID: "8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:25.827001 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.826961 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tqcjr\" (UniqueName: \"kubernetes.io/projected/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-kube-api-access-tqcjr\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:25.827001 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.826996 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-oauth-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:25.827217 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.827020 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:25.827217 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.827029 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-oauth-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:25.827217 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:25.827047 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b-console-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:26.459741 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.459709 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bc6f8dcf7-zx6db_8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b/console/0.log"
Apr 24 21:18:26.460171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.459754 2575 generic.go:358] "Generic (PLEG): container finished" podID="8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" containerID="a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4" exitCode=2
Apr 24 21:18:26.460171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.459790 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bc6f8dcf7-zx6db" event={"ID":"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b","Type":"ContainerDied","Data":"a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4"}
Apr 24 21:18:26.460171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.459810 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bc6f8dcf7-zx6db"
Apr 24 21:18:26.460171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.459819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bc6f8dcf7-zx6db" event={"ID":"8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b","Type":"ContainerDied","Data":"b71639cc64e62e2de7c07a2dd67ccd672f253e077dfb3e40b2b0d14088735426"}
Apr 24 21:18:26.460171 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.459841 2575 scope.go:117] "RemoveContainer" containerID="a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4"
Apr 24 21:18:26.467096 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.467077 2575 scope.go:117] "RemoveContainer" containerID="a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4"
Apr 24 21:18:26.467345 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:18:26.467325 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4\": container with ID starting with a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4 not found: ID does not exist" containerID="a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4"
Apr 24 21:18:26.467421 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.467354 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4"} err="failed to get container status \"a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4\": rpc error: code = NotFound desc = could not find container \"a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4\": container with ID starting with a680fe86828cb4200944f03da77159b68c441e56a92fe678d4f57e2104d6c2a4 not found: ID does not exist"
Apr 24 21:18:26.476988 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.476962 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bc6f8dcf7-zx6db"]
Apr 24 21:18:26.485585 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:26.485559 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bc6f8dcf7-zx6db"]
Apr 24 21:18:28.006532 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:28.006485 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" path="/var/lib/kubelet/pods/8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b/volumes"
Apr 24 21:18:30.379030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:30.378999 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qfs2f"
Apr 24 21:18:36.489787 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.489726 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9b6d99dc7-2lxqr" podUID="c1c7be9e-6e02-4d37-9210-a1b659d4bea5" containerName="console" containerID="cri-o://11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43" gracePeriod=15
Apr 24 21:18:36.760652 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.760627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9b6d99dc7-2lxqr_c1c7be9e-6e02-4d37-9210-a1b659d4bea5/console/0.log"
Apr 24 21:18:36.760796 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.760699 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9b6d99dc7-2lxqr"
Apr 24 21:18:36.813373 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813340 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-config\") pod \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") "
Apr 24 21:18:36.813536 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813396 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-serving-cert\") pod \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") "
Apr 24 21:18:36.813536 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813418 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-oauth-serving-cert\") pod \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") "
Apr 24 21:18:36.813536 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813441 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-service-ca\") pod \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") "
Apr 24 21:18:36.813536 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813465 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh859\" (UniqueName: \"kubernetes.io/projected/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-kube-api-access-jh859\") pod \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") "
Apr 24 21:18:36.813715 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813608 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-oauth-config\") pod \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\" (UID: \"c1c7be9e-6e02-4d37-9210-a1b659d4bea5\") "
Apr 24 21:18:36.813870 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813828 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c1c7be9e-6e02-4d37-9210-a1b659d4bea5" (UID: "c1c7be9e-6e02-4d37-9210-a1b659d4bea5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:36.813870 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813835 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-config" (OuterVolumeSpecName: "console-config") pod "c1c7be9e-6e02-4d37-9210-a1b659d4bea5" (UID: "c1c7be9e-6e02-4d37-9210-a1b659d4bea5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:36.814057 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.813896 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-service-ca" (OuterVolumeSpecName: "service-ca") pod "c1c7be9e-6e02-4d37-9210-a1b659d4bea5" (UID: "c1c7be9e-6e02-4d37-9210-a1b659d4bea5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:18:36.815756 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.815703 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c1c7be9e-6e02-4d37-9210-a1b659d4bea5" (UID: "c1c7be9e-6e02-4d37-9210-a1b659d4bea5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:36.815756 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.815743 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c1c7be9e-6e02-4d37-9210-a1b659d4bea5" (UID: "c1c7be9e-6e02-4d37-9210-a1b659d4bea5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:18:36.815875 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.815746 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-kube-api-access-jh859" (OuterVolumeSpecName: "kube-api-access-jh859") pod "c1c7be9e-6e02-4d37-9210-a1b659d4bea5" (UID: "c1c7be9e-6e02-4d37-9210-a1b659d4bea5"). InnerVolumeSpecName "kube-api-access-jh859". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:18:36.914564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.914531 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:36.914564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.914557 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:36.914564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.914567 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-oauth-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:36.914564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.914575 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-service-ca\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:36.914810 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.914585 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jh859\" (UniqueName: \"kubernetes.io/projected/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-kube-api-access-jh859\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:36.914810 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:36.914594 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1c7be9e-6e02-4d37-9210-a1b659d4bea5-console-oauth-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:18:37.492040 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.492010 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9b6d99dc7-2lxqr_c1c7be9e-6e02-4d37-9210-a1b659d4bea5/console/0.log"
Apr 24 21:18:37.492526 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.492049 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1c7be9e-6e02-4d37-9210-a1b659d4bea5" containerID="11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43" exitCode=2
Apr 24 21:18:37.492526 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.492096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9b6d99dc7-2lxqr" event={"ID":"c1c7be9e-6e02-4d37-9210-a1b659d4bea5","Type":"ContainerDied","Data":"11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43"}
Apr 24 21:18:37.492526 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.492140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9b6d99dc7-2lxqr" event={"ID":"c1c7be9e-6e02-4d37-9210-a1b659d4bea5","Type":"ContainerDied","Data":"e7e571653bf1ab1f49809ea3960ece0237100ad37e61b954858dcffb48f15899"}
Apr 24 21:18:37.492526 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.492144 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9b6d99dc7-2lxqr"
Apr 24 21:18:37.492526 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.492159 2575 scope.go:117] "RemoveContainer" containerID="11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43"
Apr 24 21:18:37.500393 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.500370 2575 scope.go:117] "RemoveContainer" containerID="11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43"
Apr 24 21:18:37.500669 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:18:37.500644 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43\": container with ID starting with 11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43 not found: ID does not exist" containerID="11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43"
Apr 24 21:18:37.500725 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.500682 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43"} err="failed to get container status \"11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43\": rpc error: code = NotFound desc = could not find container \"11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43\": container with ID starting with 11591380a192c184cf9487b15fa9d7d6b98e2b19b01ce31f9176cf2ca2dabd43 not found: ID does not exist"
Apr 24 21:18:37.512374 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.512348 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9b6d99dc7-2lxqr"]
Apr 24 21:18:37.516411 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:37.516389 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9b6d99dc7-2lxqr"]
Apr 24 21:18:38.005548 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:18:38.005512 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c7be9e-6e02-4d37-9210-a1b659d4bea5" path="/var/lib/kubelet/pods/c1c7be9e-6e02-4d37-9210-a1b659d4bea5/volumes"
Apr 24 21:19:05.412922 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.412886 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b54c958b4-lsfg5"]
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413179 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66ff2b4e-2b04-4b4e-9761-f1430f4296de" containerName="registry"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413192 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ff2b4e-2b04-4b4e-9761-f1430f4296de" containerName="registry"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413223 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1c7be9e-6e02-4d37-9210-a1b659d4bea5" containerName="console"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413232 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c7be9e-6e02-4d37-9210-a1b659d4bea5" containerName="console"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413241 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" containerName="console"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413248 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" containerName="console"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413285 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1c7be9e-6e02-4d37-9210-a1b659d4bea5" containerName="console"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413293 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c6d61c4-fcb9-4d19-884e-e3fba6b76a7b" containerName="console"
Apr 24 21:19:05.413367 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.413299 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="66ff2b4e-2b04-4b4e-9761-f1430f4296de" containerName="registry"
Apr 24 21:19:05.417448 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.417431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.426036 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.426006 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b54c958b4-lsfg5"]
Apr 24 21:19:05.525637 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.525590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-oauth-config\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.525637 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.525640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-oauth-serving-cert\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.525900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.525686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-serving-cert\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.525900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.525711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-service-ca\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.525900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.525794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-trusted-ca-bundle\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.525900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.525840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9hhk\" (UniqueName: \"kubernetes.io/projected/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-kube-api-access-n9hhk\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.525900 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.525887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-config\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.626618 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.626568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9hhk\" (UniqueName: \"kubernetes.io/projected/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-kube-api-access-n9hhk\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.626618 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.626622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-config\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.626886 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.626660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-oauth-config\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.626886 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.626676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-oauth-serving-cert\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.626886 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.626699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-serving-cert\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.626886 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.626772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-service-ca\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.626886 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.626834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-trusted-ca-bundle\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.627622 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.627591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-service-ca\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.627736 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.627591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-config\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.627736 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.627655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-oauth-serving-cert\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.627736 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.627698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-trusted-ca-bundle\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.629668 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.629647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-serving-cert\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.629781 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.629760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-oauth-config\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.634662 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.634638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9hhk\" (UniqueName: \"kubernetes.io/projected/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-kube-api-access-n9hhk\") pod \"console-6b54c958b4-lsfg5\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") " pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:19:05.726397 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.726306 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-6b54c958b4-lsfg5" Apr 24 21:19:05.872556 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:05.872511 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b54c958b4-lsfg5"] Apr 24 21:19:05.875750 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:19:05.875710 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1870a54_cb8c_4a45_ab0f_3a74da0062cd.slice/crio-2932710e2ddfbc1e9c6d417f60950e402e590bc7385cdddf71efe331a592e3b4 WatchSource:0}: Error finding container 2932710e2ddfbc1e9c6d417f60950e402e590bc7385cdddf71efe331a592e3b4: Status 404 returned error can't find the container with id 2932710e2ddfbc1e9c6d417f60950e402e590bc7385cdddf71efe331a592e3b4 Apr 24 21:19:06.568593 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:06.568555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b54c958b4-lsfg5" event={"ID":"c1870a54-cb8c-4a45-ab0f-3a74da0062cd","Type":"ContainerStarted","Data":"98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e"} Apr 24 21:19:06.568593 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:06.568596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b54c958b4-lsfg5" event={"ID":"c1870a54-cb8c-4a45-ab0f-3a74da0062cd","Type":"ContainerStarted","Data":"2932710e2ddfbc1e9c6d417f60950e402e590bc7385cdddf71efe331a592e3b4"} Apr 24 21:19:06.587075 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:06.587007 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b54c958b4-lsfg5" podStartSLOduration=1.586994329 podStartE2EDuration="1.586994329s" podCreationTimestamp="2026-04-24 21:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:19:06.586610857 +0000 UTC m=+145.187766452" 
watchObservedRunningTime="2026-04-24 21:19:06.586994329 +0000 UTC m=+145.188149890" Apr 24 21:19:15.727474 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:15.727435 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b54c958b4-lsfg5" Apr 24 21:19:15.727474 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:15.727474 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b54c958b4-lsfg5" Apr 24 21:19:15.732008 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:15.731987 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b54c958b4-lsfg5" Apr 24 21:19:16.602956 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:16.602928 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b54c958b4-lsfg5" Apr 24 21:19:16.646785 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:16.646749 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-556cbcdb8b-ksns7"] Apr 24 21:19:41.668984 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:41.668890 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-556cbcdb8b-ksns7" podUID="38204513-e6bf-4005-bbba-ef622f4d314e" containerName="console" containerID="cri-o://f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04" gracePeriod=15 Apr 24 21:19:41.899893 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:41.899870 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556cbcdb8b-ksns7_38204513-e6bf-4005-bbba-ef622f4d314e/console/0.log" Apr 24 21:19:41.900024 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:41.899935 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:19:42.009530 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009457 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-service-ca\") pod \"38204513-e6bf-4005-bbba-ef622f4d314e\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " Apr 24 21:19:42.009530 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009508 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-serving-cert\") pod \"38204513-e6bf-4005-bbba-ef622f4d314e\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " Apr 24 21:19:42.009703 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009551 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcwlv\" (UniqueName: \"kubernetes.io/projected/38204513-e6bf-4005-bbba-ef622f4d314e-kube-api-access-fcwlv\") pod \"38204513-e6bf-4005-bbba-ef622f4d314e\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " Apr 24 21:19:42.009703 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009567 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-oauth-serving-cert\") pod \"38204513-e6bf-4005-bbba-ef622f4d314e\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " Apr 24 21:19:42.009703 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009586 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-trusted-ca-bundle\") pod \"38204513-e6bf-4005-bbba-ef622f4d314e\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " Apr 24 21:19:42.009703 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009610 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-console-config\") pod \"38204513-e6bf-4005-bbba-ef622f4d314e\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " Apr 24 21:19:42.009890 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009711 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-oauth-config\") pod \"38204513-e6bf-4005-bbba-ef622f4d314e\" (UID: \"38204513-e6bf-4005-bbba-ef622f4d314e\") " Apr 24 21:19:42.009950 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009920 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-service-ca" (OuterVolumeSpecName: "service-ca") pod "38204513-e6bf-4005-bbba-ef622f4d314e" (UID: "38204513-e6bf-4005-bbba-ef622f4d314e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:42.010005 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009987 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "38204513-e6bf-4005-bbba-ef622f4d314e" (UID: "38204513-e6bf-4005-bbba-ef622f4d314e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:42.010085 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.009999 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-console-config" (OuterVolumeSpecName: "console-config") pod "38204513-e6bf-4005-bbba-ef622f4d314e" (UID: "38204513-e6bf-4005-bbba-ef622f4d314e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:42.010085 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.010007 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "38204513-e6bf-4005-bbba-ef622f4d314e" (UID: "38204513-e6bf-4005-bbba-ef622f4d314e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:19:42.010255 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.010237 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-oauth-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:19:42.010318 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.010263 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-trusted-ca-bundle\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:19:42.010318 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.010278 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-console-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:19:42.010318 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:19:42.010295 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/38204513-e6bf-4005-bbba-ef622f4d314e-service-ca\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:19:42.011745 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.011721 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "38204513-e6bf-4005-bbba-ef622f4d314e" (UID: "38204513-e6bf-4005-bbba-ef622f4d314e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:42.011814 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.011740 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "38204513-e6bf-4005-bbba-ef622f4d314e" (UID: "38204513-e6bf-4005-bbba-ef622f4d314e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:19:42.011859 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.011844 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38204513-e6bf-4005-bbba-ef622f4d314e-kube-api-access-fcwlv" (OuterVolumeSpecName: "kube-api-access-fcwlv") pod "38204513-e6bf-4005-bbba-ef622f4d314e" (UID: "38204513-e6bf-4005-bbba-ef622f4d314e"). InnerVolumeSpecName "kube-api-access-fcwlv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:19:42.110867 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.110833 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:19:42.110867 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.110864 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcwlv\" (UniqueName: \"kubernetes.io/projected/38204513-e6bf-4005-bbba-ef622f4d314e-kube-api-access-fcwlv\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:19:42.110867 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.110874 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/38204513-e6bf-4005-bbba-ef622f4d314e-console-oauth-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:19:42.667867 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.667840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556cbcdb8b-ksns7_38204513-e6bf-4005-bbba-ef622f4d314e/console/0.log" Apr 24 21:19:42.668037 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.667878 2575 generic.go:358] "Generic (PLEG): container finished" podID="38204513-e6bf-4005-bbba-ef622f4d314e" containerID="f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04" exitCode=2 Apr 24 21:19:42.668037 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.667952 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556cbcdb8b-ksns7" Apr 24 21:19:42.668037 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.667962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556cbcdb8b-ksns7" event={"ID":"38204513-e6bf-4005-bbba-ef622f4d314e","Type":"ContainerDied","Data":"f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04"} Apr 24 21:19:42.668037 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.667988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556cbcdb8b-ksns7" event={"ID":"38204513-e6bf-4005-bbba-ef622f4d314e","Type":"ContainerDied","Data":"e9105ee4dec185bfabef94f44641d39cd7c61db65c23e1e25a0f205f22af0f68"} Apr 24 21:19:42.668037 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.668004 2575 scope.go:117] "RemoveContainer" containerID="f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04" Apr 24 21:19:42.676089 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.675920 2575 scope.go:117] "RemoveContainer" containerID="f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04" Apr 24 21:19:42.676298 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:19:42.676248 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04\": container with ID starting with f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04 not found: ID does not exist" containerID="f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04" Apr 24 21:19:42.676298 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.676274 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04"} err="failed to get container status \"f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04\": rpc error: code = 
NotFound desc = could not find container \"f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04\": container with ID starting with f76b7ffc96bccabef1ad5dfe0babf0187322fb181ebfbd083cb3f09175326a04 not found: ID does not exist" Apr 24 21:19:42.688496 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.688467 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-556cbcdb8b-ksns7"] Apr 24 21:19:42.692946 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:42.692920 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-556cbcdb8b-ksns7"] Apr 24 21:19:44.005705 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:19:44.005671 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38204513-e6bf-4005-bbba-ef622f4d314e" path="/var/lib/kubelet/pods/38204513-e6bf-4005-bbba-ef622f4d314e/volumes" Apr 24 21:20:13.553373 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.553334 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr"] Apr 24 21:20:13.553959 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.553713 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38204513-e6bf-4005-bbba-ef622f4d314e" containerName="console" Apr 24 21:20:13.553959 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.553732 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="38204513-e6bf-4005-bbba-ef622f4d314e" containerName="console" Apr 24 21:20:13.553959 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.553794 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="38204513-e6bf-4005-bbba-ef622f4d314e" containerName="console" Apr 24 21:20:13.556793 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.556770 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.559086 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.559050 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:20:13.559211 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.559193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-npgq7\"" Apr 24 21:20:13.559961 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.559942 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:20:13.564859 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.564834 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr"] Apr 24 21:20:13.635930 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.635891 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.635930 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.635929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.636206 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.635952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7t9s\" (UniqueName: \"kubernetes.io/projected/150dab05-11f9-44e4-b319-8f008d203f98-kube-api-access-d7t9s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.736354 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.736322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.736354 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.736354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.736564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.736379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7t9s\" (UniqueName: \"kubernetes.io/projected/150dab05-11f9-44e4-b319-8f008d203f98-kube-api-access-d7t9s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.736788 ip-10-0-141-46 
kubenswrapper[2575]: I0424 21:20:13.736765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.736824 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.736776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.744672 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.744642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7t9s\" (UniqueName: \"kubernetes.io/projected/150dab05-11f9-44e4-b319-8f008d203f98-kube-api-access-d7t9s\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.866188 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.866102 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:13.983471 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:13.983440 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr"] Apr 24 21:20:13.986747 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:20:13.986719 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod150dab05_11f9_44e4_b319_8f008d203f98.slice/crio-963b2c5607fbbd40effd9a9ae48930371ec8d2d6cf821064d6d48a44a6a1f948 WatchSource:0}: Error finding container 963b2c5607fbbd40effd9a9ae48930371ec8d2d6cf821064d6d48a44a6a1f948: Status 404 returned error can't find the container with id 963b2c5607fbbd40effd9a9ae48930371ec8d2d6cf821064d6d48a44a6a1f948 Apr 24 21:20:14.756723 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:14.756687 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" event={"ID":"150dab05-11f9-44e4-b319-8f008d203f98","Type":"ContainerStarted","Data":"963b2c5607fbbd40effd9a9ae48930371ec8d2d6cf821064d6d48a44a6a1f948"} Apr 24 21:20:19.773385 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:19.773353 2575 generic.go:358] "Generic (PLEG): container finished" podID="150dab05-11f9-44e4-b319-8f008d203f98" containerID="58e0dad4cf5bc9b5850ee798f33ca6c8c40c4aefd9aa1a1264b3992148d3fe43" exitCode=0 Apr 24 21:20:19.773785 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:19.773404 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" event={"ID":"150dab05-11f9-44e4-b319-8f008d203f98","Type":"ContainerDied","Data":"58e0dad4cf5bc9b5850ee798f33ca6c8c40c4aefd9aa1a1264b3992148d3fe43"} Apr 24 21:20:22.782535 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:20:22.782496 2575 generic.go:358] "Generic (PLEG): container finished" podID="150dab05-11f9-44e4-b319-8f008d203f98" containerID="656bb6e950287ba0214675be468d32d9348173245c23a7b6fd424b16dc479087" exitCode=0 Apr 24 21:20:22.782919 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:22.782552 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" event={"ID":"150dab05-11f9-44e4-b319-8f008d203f98","Type":"ContainerDied","Data":"656bb6e950287ba0214675be468d32d9348173245c23a7b6fd424b16dc479087"} Apr 24 21:20:28.802389 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:28.802347 2575 generic.go:358] "Generic (PLEG): container finished" podID="150dab05-11f9-44e4-b319-8f008d203f98" containerID="e5069d4a08a4804c257398dba3117a6f58dd449dca47d68e072f4486020fe8dc" exitCode=0 Apr 24 21:20:28.802766 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:28.802398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" event={"ID":"150dab05-11f9-44e4-b319-8f008d203f98","Type":"ContainerDied","Data":"e5069d4a08a4804c257398dba3117a6f58dd449dca47d68e072f4486020fe8dc"} Apr 24 21:20:29.923239 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:29.923216 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:29.966662 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:29.966630 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7t9s\" (UniqueName: \"kubernetes.io/projected/150dab05-11f9-44e4-b319-8f008d203f98-kube-api-access-d7t9s\") pod \"150dab05-11f9-44e4-b319-8f008d203f98\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " Apr 24 21:20:29.966813 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:29.966684 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-util\") pod \"150dab05-11f9-44e4-b319-8f008d203f98\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " Apr 24 21:20:29.966813 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:29.966734 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-bundle\") pod \"150dab05-11f9-44e4-b319-8f008d203f98\" (UID: \"150dab05-11f9-44e4-b319-8f008d203f98\") " Apr 24 21:20:29.967315 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:29.967282 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-bundle" (OuterVolumeSpecName: "bundle") pod "150dab05-11f9-44e4-b319-8f008d203f98" (UID: "150dab05-11f9-44e4-b319-8f008d203f98"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:20:29.968893 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:29.968870 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150dab05-11f9-44e4-b319-8f008d203f98-kube-api-access-d7t9s" (OuterVolumeSpecName: "kube-api-access-d7t9s") pod "150dab05-11f9-44e4-b319-8f008d203f98" (UID: "150dab05-11f9-44e4-b319-8f008d203f98"). InnerVolumeSpecName "kube-api-access-d7t9s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:20:29.970687 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:29.970665 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-util" (OuterVolumeSpecName: "util") pod "150dab05-11f9-44e4-b319-8f008d203f98" (UID: "150dab05-11f9-44e4-b319-8f008d203f98"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:20:30.068163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:30.068043 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7t9s\" (UniqueName: \"kubernetes.io/projected/150dab05-11f9-44e4-b319-8f008d203f98-kube-api-access-d7t9s\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:20:30.068163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:30.068110 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-util\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:20:30.068163 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:30.068120 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/150dab05-11f9-44e4-b319-8f008d203f98-bundle\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:20:30.809826 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:30.809792 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" event={"ID":"150dab05-11f9-44e4-b319-8f008d203f98","Type":"ContainerDied","Data":"963b2c5607fbbd40effd9a9ae48930371ec8d2d6cf821064d6d48a44a6a1f948"} Apr 24 21:20:30.809826 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:30.809825 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="963b2c5607fbbd40effd9a9ae48930371ec8d2d6cf821064d6d48a44a6a1f948" Apr 24 21:20:30.810031 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:30.809851 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv7vhr" Apr 24 21:20:35.719241 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719191 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425"] Apr 24 21:20:35.719611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719437 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="150dab05-11f9-44e4-b319-8f008d203f98" containerName="util" Apr 24 21:20:35.719611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719449 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="150dab05-11f9-44e4-b319-8f008d203f98" containerName="util" Apr 24 21:20:35.719611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719456 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="150dab05-11f9-44e4-b319-8f008d203f98" containerName="pull" Apr 24 21:20:35.719611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719462 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="150dab05-11f9-44e4-b319-8f008d203f98" containerName="pull" Apr 24 21:20:35.719611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719474 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="150dab05-11f9-44e4-b319-8f008d203f98" containerName="extract" Apr 24 21:20:35.719611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719480 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="150dab05-11f9-44e4-b319-8f008d203f98" containerName="extract" Apr 24 21:20:35.719611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.719520 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="150dab05-11f9-44e4-b319-8f008d203f98" containerName="extract" Apr 24 21:20:35.763966 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.763923 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425"] Apr 24 21:20:35.764143 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.764050 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:35.768206 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.768186 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:20:35.768316 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.768241 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-w8cxv\"" Apr 24 21:20:35.768674 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.768654 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:20:35.769225 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.769210 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:20:35.811789 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.811759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rdt\" (UniqueName: 
\"kubernetes.io/projected/25404c61-cd96-414e-b505-f10b2532a3c6-kube-api-access-l8rdt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jt425\" (UID: \"25404c61-cd96-414e-b505-f10b2532a3c6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:35.811789 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.811795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/25404c61-cd96-414e-b505-f10b2532a3c6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jt425\" (UID: \"25404c61-cd96-414e-b505-f10b2532a3c6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:35.912492 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.912455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8rdt\" (UniqueName: \"kubernetes.io/projected/25404c61-cd96-414e-b505-f10b2532a3c6-kube-api-access-l8rdt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jt425\" (UID: \"25404c61-cd96-414e-b505-f10b2532a3c6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:35.912492 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.912497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/25404c61-cd96-414e-b505-f10b2532a3c6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jt425\" (UID: \"25404c61-cd96-414e-b505-f10b2532a3c6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:35.914880 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.914854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/25404c61-cd96-414e-b505-f10b2532a3c6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jt425\" (UID: 
\"25404c61-cd96-414e-b505-f10b2532a3c6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:35.922085 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:35.922036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rdt\" (UniqueName: \"kubernetes.io/projected/25404c61-cd96-414e-b505-f10b2532a3c6-kube-api-access-l8rdt\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-jt425\" (UID: \"25404c61-cd96-414e-b505-f10b2532a3c6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:36.074977 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:36.074943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:36.199912 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:36.199884 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425"] Apr 24 21:20:36.202813 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:20:36.202775 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25404c61_cd96_414e_b505_f10b2532a3c6.slice/crio-cbc1e13b58ec3bf38d69d63bde775efa4ab7fe00c6a5c2174898170d90a1f840 WatchSource:0}: Error finding container cbc1e13b58ec3bf38d69d63bde775efa4ab7fe00c6a5c2174898170d90a1f840: Status 404 returned error can't find the container with id cbc1e13b58ec3bf38d69d63bde775efa4ab7fe00c6a5c2174898170d90a1f840 Apr 24 21:20:36.825241 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:36.825195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" event={"ID":"25404c61-cd96-414e-b505-f10b2532a3c6","Type":"ContainerStarted","Data":"cbc1e13b58ec3bf38d69d63bde775efa4ab7fe00c6a5c2174898170d90a1f840"} Apr 24 21:20:40.385846 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:20:40.385807 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lzv9l"] Apr 24 21:20:40.410780 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.410750 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lzv9l"] Apr 24 21:20:40.410945 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.410875 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.413404 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.413378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:20:40.413533 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.413412 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 21:20:40.413533 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.413454 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5gv7q\"" Apr 24 21:20:40.548813 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.548774 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.548991 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.548835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-cabundle0\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 
21:20:40.548991 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.548927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlddb\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-kube-api-access-rlddb\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.649673 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.649569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlddb\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-kube-api-access-rlddb\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.649673 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.649650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.649891 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.649697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-cabundle0\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.649891 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.649812 2575 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 21:20:40.649891 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.649837 2575 secret.go:281] references 
non-existent secret key: ca.crt Apr 24 21:20:40.649891 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.649848 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:20:40.649891 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.649863 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lzv9l: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:20:40.650182 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.649920 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates podName:dad5dff3-99e9-471b-87f6-0a4c5e48bf67 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:41.149901066 +0000 UTC m=+239.751056620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates") pod "keda-operator-ffbb595cb-lzv9l" (UID: "dad5dff3-99e9-471b-87f6-0a4c5e48bf67") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:20:40.650341 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.650322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-cabundle0\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.665154 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.665124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlddb\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-kube-api-access-rlddb\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: 
\"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:40.740599 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.740562 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76"] Apr 24 21:20:40.765295 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.765263 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76"] Apr 24 21:20:40.765445 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.765349 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.767946 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.767918 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:20:40.842376 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.842337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" event={"ID":"25404c61-cd96-414e-b505-f10b2532a3c6","Type":"ContainerStarted","Data":"b8298ac18d14ee511a7cf8b01137837cb5b2cecc59825df1d3ca288118173545"} Apr 24 21:20:40.842580 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.842472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:20:40.851587 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.851559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.851587 ip-10-0-141-46 
kubenswrapper[2575]: I0424 21:20:40.851591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.851795 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.851623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhbbw\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-kube-api-access-lhbbw\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.870389 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.870342 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" podStartSLOduration=2.259939974 podStartE2EDuration="5.870329331s" podCreationTimestamp="2026-04-24 21:20:35 +0000 UTC" firstStartedPulling="2026-04-24 21:20:36.204570322 +0000 UTC m=+234.805725860" lastFinishedPulling="2026-04-24 21:20:39.814959675 +0000 UTC m=+238.416115217" observedRunningTime="2026-04-24 21:20:40.86866965 +0000 UTC m=+239.469825211" watchObservedRunningTime="2026-04-24 21:20:40.870329331 +0000 UTC m=+239.471484892" Apr 24 21:20:40.953122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.953005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.953122 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.953053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.953122 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.953113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhbbw\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-kube-api-access-lhbbw\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.953413 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.953195 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:20:40.953413 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.953220 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:20:40.953413 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.953243 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76: references non-existent secret key: tls.crt Apr 24 21:20:40.953413 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:40.953310 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates podName:a1bd175e-1b53-405a-b99f-5120ffb3f1e2 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:41.453289419 +0000 UTC m=+240.054444969 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates") pod "keda-metrics-apiserver-7c9f485588-ncg76" (UID: "a1bd175e-1b53-405a-b99f-5120ffb3f1e2") : references non-existent secret key: tls.crt Apr 24 21:20:40.953549 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.953534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:40.965364 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:40.965338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhbbw\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-kube-api-access-lhbbw\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:41.008415 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.008359 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-vbmhf"] Apr 24 21:20:41.034749 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.034715 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vbmhf"] Apr 24 21:20:41.034921 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.034854 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.037974 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.037949 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:20:41.155403 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.155363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcjkz\" (UniqueName: \"kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-kube-api-access-lcjkz\") pod \"keda-admission-cf49989db-vbmhf\" (UID: \"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb\") " pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.155575 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.155412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:41.155575 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.155434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-certificates\") pod \"keda-admission-cf49989db-vbmhf\" (UID: \"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb\") " pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.155575 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.155541 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:20:41.155575 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.155567 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:20:41.155575 ip-10-0-141-46 kubenswrapper[2575]: E0424 
21:20:41.155577 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lzv9l: references non-existent secret key: ca.crt Apr 24 21:20:41.155735 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.155627 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates podName:dad5dff3-99e9-471b-87f6-0a4c5e48bf67 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:42.1556108 +0000 UTC m=+240.756766340 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates") pod "keda-operator-ffbb595cb-lzv9l" (UID: "dad5dff3-99e9-471b-87f6-0a4c5e48bf67") : references non-existent secret key: ca.crt Apr 24 21:20:41.256295 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.256197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjkz\" (UniqueName: \"kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-kube-api-access-lcjkz\") pod \"keda-admission-cf49989db-vbmhf\" (UID: \"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb\") " pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.256295 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.256265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-certificates\") pod \"keda-admission-cf49989db-vbmhf\" (UID: \"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb\") " pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.256518 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.256422 2575 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 21:20:41.256518 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.256460 
2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-vbmhf: secret "keda-admission-webhooks-certs" not found Apr 24 21:20:41.256629 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.256528 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-certificates podName:fa4f9b9f-b50d-456f-bae6-9af02c8d10fb nodeName:}" failed. No retries permitted until 2026-04-24 21:20:41.756506757 +0000 UTC m=+240.357662311 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-certificates") pod "keda-admission-cf49989db-vbmhf" (UID: "fa4f9b9f-b50d-456f-bae6-9af02c8d10fb") : secret "keda-admission-webhooks-certs" not found Apr 24 21:20:41.267485 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.267450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjkz\" (UniqueName: \"kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-kube-api-access-lcjkz\") pod \"keda-admission-cf49989db-vbmhf\" (UID: \"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb\") " pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.458887 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.458256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:41.458887 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.458421 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:20:41.458887 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.458448 2575 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:20:41.458887 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.458472 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76: references non-existent secret key: tls.crt Apr 24 21:20:41.458887 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:41.458532 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates podName:a1bd175e-1b53-405a-b99f-5120ffb3f1e2 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:42.458513866 +0000 UTC m=+241.059669420 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates") pod "keda-metrics-apiserver-7c9f485588-ncg76" (UID: "a1bd175e-1b53-405a-b99f-5120ffb3f1e2") : references non-existent secret key: tls.crt Apr 24 21:20:41.761531 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.761494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-certificates\") pod \"keda-admission-cf49989db-vbmhf\" (UID: \"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb\") " pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.764043 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.764005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa4f9b9f-b50d-456f-bae6-9af02c8d10fb-certificates\") pod \"keda-admission-cf49989db-vbmhf\" (UID: \"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb\") " pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:41.947412 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.947375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5gv7q\"" Apr 24 21:20:41.955440 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:41.955416 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:42.080240 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:42.080208 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vbmhf"] Apr 24 21:20:42.084258 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:20:42.084228 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa4f9b9f_b50d_456f_bae6_9af02c8d10fb.slice/crio-1086f37a34621c09d01e8c7a931e556aed1a9a33a7509ae192021393236b8e3b WatchSource:0}: Error finding container 1086f37a34621c09d01e8c7a931e556aed1a9a33a7509ae192021393236b8e3b: Status 404 returned error can't find the container with id 1086f37a34621c09d01e8c7a931e556aed1a9a33a7509ae192021393236b8e3b Apr 24 21:20:42.165537 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:42.165500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:42.165706 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.165682 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:20:42.165763 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.165713 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:20:42.165763 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.165727 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lzv9l: 
references non-existent secret key: ca.crt Apr 24 21:20:42.165844 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.165795 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates podName:dad5dff3-99e9-471b-87f6-0a4c5e48bf67 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:44.165773595 +0000 UTC m=+242.766929137 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates") pod "keda-operator-ffbb595cb-lzv9l" (UID: "dad5dff3-99e9-471b-87f6-0a4c5e48bf67") : references non-existent secret key: ca.crt Apr 24 21:20:42.468309 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:42.468271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:42.468677 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.468406 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:20:42.468677 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.468418 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:20:42.468677 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.468436 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76: references non-existent secret key: tls.crt Apr 24 21:20:42.468677 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:42.468485 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates 
podName:a1bd175e-1b53-405a-b99f-5120ffb3f1e2 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:44.468471658 +0000 UTC m=+243.069627202 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates") pod "keda-metrics-apiserver-7c9f485588-ncg76" (UID: "a1bd175e-1b53-405a-b99f-5120ffb3f1e2") : references non-existent secret key: tls.crt Apr 24 21:20:42.848335 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:42.848299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vbmhf" event={"ID":"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb","Type":"ContainerStarted","Data":"1086f37a34621c09d01e8c7a931e556aed1a9a33a7509ae192021393236b8e3b"} Apr 24 21:20:44.182891 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:44.182850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:44.183318 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.182988 2575 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:20:44.183318 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.183002 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:20:44.183318 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.183012 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lzv9l: references non-existent secret key: ca.crt Apr 24 21:20:44.183318 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.183084 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates podName:dad5dff3-99e9-471b-87f6-0a4c5e48bf67 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:48.183048816 +0000 UTC m=+246.784204359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates") pod "keda-operator-ffbb595cb-lzv9l" (UID: "dad5dff3-99e9-471b-87f6-0a4c5e48bf67") : references non-existent secret key: ca.crt Apr 24 21:20:44.485977 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:44.485891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:44.486142 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.486004 2575 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:20:44.486142 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.486016 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:20:44.486142 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.486032 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76: references non-existent secret key: tls.crt Apr 24 21:20:44.486142 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:20:44.486110 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates podName:a1bd175e-1b53-405a-b99f-5120ffb3f1e2 nodeName:}" failed. No retries permitted until 2026-04-24 21:20:48.486095962 +0000 UTC m=+247.087251507 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates") pod "keda-metrics-apiserver-7c9f485588-ncg76" (UID: "a1bd175e-1b53-405a-b99f-5120ffb3f1e2") : references non-existent secret key: tls.crt Apr 24 21:20:45.859954 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:45.859918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vbmhf" event={"ID":"fa4f9b9f-b50d-456f-bae6-9af02c8d10fb","Type":"ContainerStarted","Data":"17b9fc2d9ae35d0f5965949d9e83724306996bcce0b6b77b49cb1bea9356b22b"} Apr 24 21:20:45.860386 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:45.860042 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:20:45.877377 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:45.877319 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-vbmhf" podStartSLOduration=2.811054729 podStartE2EDuration="5.877303865s" podCreationTimestamp="2026-04-24 21:20:40 +0000 UTC" firstStartedPulling="2026-04-24 21:20:42.086127352 +0000 UTC m=+240.687282891" lastFinishedPulling="2026-04-24 21:20:45.152376481 +0000 UTC m=+243.753532027" observedRunningTime="2026-04-24 21:20:45.875825498 +0000 UTC m=+244.476981059" watchObservedRunningTime="2026-04-24 21:20:45.877303865 +0000 UTC m=+244.478459426" Apr 24 21:20:48.216424 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.216388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:48.218839 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.218812 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dad5dff3-99e9-471b-87f6-0a4c5e48bf67-certificates\") pod \"keda-operator-ffbb595cb-lzv9l\" (UID: \"dad5dff3-99e9-471b-87f6-0a4c5e48bf67\") " pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:48.220726 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.220707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:48.360806 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.360771 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lzv9l"] Apr 24 21:20:48.363840 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:20:48.363804 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad5dff3_99e9_471b_87f6_0a4c5e48bf67.slice/crio-cf831ce3e168f9db50c3b35e15f341df675989511347e1cdb48fc9cbcbea9c29 WatchSource:0}: Error finding container cf831ce3e168f9db50c3b35e15f341df675989511347e1cdb48fc9cbcbea9c29: Status 404 returned error can't find the container with id cf831ce3e168f9db50c3b35e15f341df675989511347e1cdb48fc9cbcbea9c29 Apr 24 21:20:48.519742 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.519648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:48.522150 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.522129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a1bd175e-1b53-405a-b99f-5120ffb3f1e2-certificates\") pod 
\"keda-metrics-apiserver-7c9f485588-ncg76\" (UID: \"a1bd175e-1b53-405a-b99f-5120ffb3f1e2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:48.575180 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.575128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:48.695479 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:20:48.695430 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1bd175e_1b53_405a_b99f_5120ffb3f1e2.slice/crio-f54838bb2a32a710a4a7fa847e3f3a3c5f12b9853aa60112905b23534f52ab4b WatchSource:0}: Error finding container f54838bb2a32a710a4a7fa847e3f3a3c5f12b9853aa60112905b23534f52ab4b: Status 404 returned error can't find the container with id f54838bb2a32a710a4a7fa847e3f3a3c5f12b9853aa60112905b23534f52ab4b Apr 24 21:20:48.696688 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.696659 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76"] Apr 24 21:20:48.868329 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.868290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" event={"ID":"a1bd175e-1b53-405a-b99f-5120ffb3f1e2","Type":"ContainerStarted","Data":"f54838bb2a32a710a4a7fa847e3f3a3c5f12b9853aa60112905b23534f52ab4b"} Apr 24 21:20:48.869314 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:48.869290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" event={"ID":"dad5dff3-99e9-471b-87f6-0a4c5e48bf67","Type":"ContainerStarted","Data":"cf831ce3e168f9db50c3b35e15f341df675989511347e1cdb48fc9cbcbea9c29"} Apr 24 21:20:52.881745 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:52.881707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" event={"ID":"a1bd175e-1b53-405a-b99f-5120ffb3f1e2","Type":"ContainerStarted","Data":"7ca74e2e7a21824ab2768ac34c0ff66fab148abded2a1a88f0684e1fe5ecc5c4"} Apr 24 21:20:52.882178 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:52.881926 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:20:52.883045 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:52.883028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" event={"ID":"dad5dff3-99e9-471b-87f6-0a4c5e48bf67","Type":"ContainerStarted","Data":"836edbbabb9746aa8d3bbadcc6eb1f58ff4ba8b5e309953c80e5fca016fd81d0"} Apr 24 21:20:52.883117 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:52.883104 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:20:52.900788 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:52.900736 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" podStartSLOduration=9.39094571 podStartE2EDuration="12.90072242s" podCreationTimestamp="2026-04-24 21:20:40 +0000 UTC" firstStartedPulling="2026-04-24 21:20:48.697095499 +0000 UTC m=+247.298251042" lastFinishedPulling="2026-04-24 21:20:52.206872206 +0000 UTC m=+250.808027752" observedRunningTime="2026-04-24 21:20:52.899382568 +0000 UTC m=+251.500538140" watchObservedRunningTime="2026-04-24 21:20:52.90072242 +0000 UTC m=+251.501878287" Apr 24 21:20:52.923407 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:20:52.923341 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" podStartSLOduration=9.076382018 podStartE2EDuration="12.923326553s" podCreationTimestamp="2026-04-24 21:20:40 +0000 UTC" firstStartedPulling="2026-04-24 
21:20:48.365224706 +0000 UTC m=+246.966380245" lastFinishedPulling="2026-04-24 21:20:52.212169237 +0000 UTC m=+250.813324780" observedRunningTime="2026-04-24 21:20:52.920679392 +0000 UTC m=+251.521834954" watchObservedRunningTime="2026-04-24 21:20:52.923326553 +0000 UTC m=+251.524482490" Apr 24 21:21:01.847399 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:01.847371 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-jt425" Apr 24 21:21:03.890017 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:03.889984 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ncg76" Apr 24 21:21:06.864769 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:06.864737 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-vbmhf" Apr 24 21:21:13.887994 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:13.887964 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-lzv9l" Apr 24 21:21:41.886826 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:41.886798 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:21:41.887423 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:41.886799 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:21:41.890325 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:41.890304 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:21:48.145633 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.145600 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-ngx4p"] Apr 24 21:21:48.149111 
ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.149093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.152643 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.152616 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 21:21:48.152784 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.152683 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:21:48.152852 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.152841 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:21:48.153653 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.153634 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-8vt8t\"" Apr 24 21:21:48.157906 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.157884 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv"] Apr 24 21:21:48.160833 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.160816 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.162428 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.162408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-ngx4p"] Apr 24 21:21:48.163999 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.163970 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 21:21:48.164111 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.164023 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-pjjdx\"" Apr 24 21:21:48.177869 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.177812 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv"] Apr 24 21:21:48.190134 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.190103 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-dznv7"] Apr 24 21:21:48.193679 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.193661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.196089 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.196054 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ljxsr\"" Apr 24 21:21:48.196219 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.196195 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:21:48.201979 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.201954 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dznv7"] Apr 24 21:21:48.250561 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.250520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6214ebb1-ab50-4683-8130-dacaf05d43fd-cert\") pod \"kserve-controller-manager-74fc8f6f96-ngx4p\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") " pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.250561 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.250560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f23d65a8-d1c8-46a5-9de4-c141395162b0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ngqtv\" (UID: \"f23d65a8-d1c8-46a5-9de4-c141395162b0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.250769 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.250580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0-data\") pod \"seaweedfs-86cc847c5c-dznv7\" (UID: \"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0\") " pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.250769 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.250609 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbvgn\" (UniqueName: \"kubernetes.io/projected/6214ebb1-ab50-4683-8130-dacaf05d43fd-kube-api-access-xbvgn\") pod \"kserve-controller-manager-74fc8f6f96-ngx4p\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") " pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.250769 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.250640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmc9\" (UniqueName: \"kubernetes.io/projected/23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0-kube-api-access-ngmc9\") pod \"seaweedfs-86cc847c5c-dznv7\" (UID: \"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0\") " pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.250769 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.250669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7v4\" (UniqueName: \"kubernetes.io/projected/f23d65a8-d1c8-46a5-9de4-c141395162b0-kube-api-access-lb7v4\") pod \"llmisvc-controller-manager-68cc5db7c4-ngqtv\" (UID: \"f23d65a8-d1c8-46a5-9de4-c141395162b0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.351912 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.351879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7v4\" (UniqueName: \"kubernetes.io/projected/f23d65a8-d1c8-46a5-9de4-c141395162b0-kube-api-access-lb7v4\") pod \"llmisvc-controller-manager-68cc5db7c4-ngqtv\" (UID: \"f23d65a8-d1c8-46a5-9de4-c141395162b0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.352127 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.351964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6214ebb1-ab50-4683-8130-dacaf05d43fd-cert\") pod 
\"kserve-controller-manager-74fc8f6f96-ngx4p\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") " pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.352127 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.351987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f23d65a8-d1c8-46a5-9de4-c141395162b0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ngqtv\" (UID: \"f23d65a8-d1c8-46a5-9de4-c141395162b0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.352127 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.352004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0-data\") pod \"seaweedfs-86cc847c5c-dznv7\" (UID: \"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0\") " pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.352127 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.352029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbvgn\" (UniqueName: \"kubernetes.io/projected/6214ebb1-ab50-4683-8130-dacaf05d43fd-kube-api-access-xbvgn\") pod \"kserve-controller-manager-74fc8f6f96-ngx4p\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") " pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.352127 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.352058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmc9\" (UniqueName: \"kubernetes.io/projected/23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0-kube-api-access-ngmc9\") pod \"seaweedfs-86cc847c5c-dznv7\" (UID: \"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0\") " pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.352536 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.352515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0-data\") pod \"seaweedfs-86cc847c5c-dznv7\" (UID: \"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0\") " pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.354615 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.354595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f23d65a8-d1c8-46a5-9de4-c141395162b0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-ngqtv\" (UID: \"f23d65a8-d1c8-46a5-9de4-c141395162b0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.354713 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.354638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6214ebb1-ab50-4683-8130-dacaf05d43fd-cert\") pod \"kserve-controller-manager-74fc8f6f96-ngx4p\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") " pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.361173 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.361130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7v4\" (UniqueName: \"kubernetes.io/projected/f23d65a8-d1c8-46a5-9de4-c141395162b0-kube-api-access-lb7v4\") pod \"llmisvc-controller-manager-68cc5db7c4-ngqtv\" (UID: \"f23d65a8-d1c8-46a5-9de4-c141395162b0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.361312 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.361249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmc9\" (UniqueName: \"kubernetes.io/projected/23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0-kube-api-access-ngmc9\") pod \"seaweedfs-86cc847c5c-dznv7\" (UID: \"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0\") " pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.361312 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.361258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xbvgn\" (UniqueName: \"kubernetes.io/projected/6214ebb1-ab50-4683-8130-dacaf05d43fd-kube-api-access-xbvgn\") pod \"kserve-controller-manager-74fc8f6f96-ngx4p\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") " pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.459347 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.459239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 21:21:48.471077 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.471031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:48.504487 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.504454 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:48.601852 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.601819 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-ngx4p"] Apr 24 21:21:48.605953 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:21:48.605917 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6214ebb1_ab50_4683_8130_dacaf05d43fd.slice/crio-f8e6183d0d6c188ba366ffdbb95bdf722e0f37e4a615811fadb4defbe5562155 WatchSource:0}: Error finding container f8e6183d0d6c188ba366ffdbb95bdf722e0f37e4a615811fadb4defbe5562155: Status 404 returned error can't find the container with id f8e6183d0d6c188ba366ffdbb95bdf722e0f37e4a615811fadb4defbe5562155 Apr 24 21:21:48.607131 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.607108 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:21:48.651569 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.651461 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv"] Apr 24 21:21:48.654007 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:21:48.653972 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf23d65a8_d1c8_46a5_9de4_c141395162b0.slice/crio-b90e8e4d1e4a11a7ba3c5b04742346ef9c0d6592bee52dfcbd1c1270d046a8ca WatchSource:0}: Error finding container b90e8e4d1e4a11a7ba3c5b04742346ef9c0d6592bee52dfcbd1c1270d046a8ca: Status 404 returned error can't find the container with id b90e8e4d1e4a11a7ba3c5b04742346ef9c0d6592bee52dfcbd1c1270d046a8ca Apr 24 21:21:48.672461 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:48.672428 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dznv7"] Apr 24 21:21:48.675014 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:21:48.674984 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23a5d406_2cd1_44fb_81d8_8bb8c63f5ea0.slice/crio-5b090ef571ba5b1eed7f1a5cb7e8db28c03b464a04c0769b5c403ec35da8f9d8 WatchSource:0}: Error finding container 5b090ef571ba5b1eed7f1a5cb7e8db28c03b464a04c0769b5c403ec35da8f9d8: Status 404 returned error can't find the container with id 5b090ef571ba5b1eed7f1a5cb7e8db28c03b464a04c0769b5c403ec35da8f9d8 Apr 24 21:21:49.043764 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:49.043726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" event={"ID":"f23d65a8-d1c8-46a5-9de4-c141395162b0","Type":"ContainerStarted","Data":"b90e8e4d1e4a11a7ba3c5b04742346ef9c0d6592bee52dfcbd1c1270d046a8ca"} Apr 24 21:21:49.044734 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:49.044705 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dznv7" event={"ID":"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0","Type":"ContainerStarted","Data":"5b090ef571ba5b1eed7f1a5cb7e8db28c03b464a04c0769b5c403ec35da8f9d8"} Apr 24 
21:21:49.045652 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:49.045632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" event={"ID":"6214ebb1-ab50-4683-8130-dacaf05d43fd","Type":"ContainerStarted","Data":"f8e6183d0d6c188ba366ffdbb95bdf722e0f37e4a615811fadb4defbe5562155"} Apr 24 21:21:54.070591 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.070555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" event={"ID":"f23d65a8-d1c8-46a5-9de4-c141395162b0","Type":"ContainerStarted","Data":"ca518b15682f6a97b498ee2f900af39506eddc202248b0990f16562a0955d39a"} Apr 24 21:21:54.071039 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.070654 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" Apr 24 21:21:54.071904 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.071882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dznv7" event={"ID":"23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0","Type":"ContainerStarted","Data":"b75e952e911a49e2e93c1984e89a01ec28c5dee5a7cf6095e914fd325d553a36"} Apr 24 21:21:54.071998 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.071949 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-dznv7" Apr 24 21:21:54.073111 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.073090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" event={"ID":"6214ebb1-ab50-4683-8130-dacaf05d43fd","Type":"ContainerStarted","Data":"2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963"} Apr 24 21:21:54.073220 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.073206 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" Apr 24 
21:21:54.090546 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.090499 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv" podStartSLOduration=1.537179402 podStartE2EDuration="6.090484741s" podCreationTimestamp="2026-04-24 21:21:48 +0000 UTC" firstStartedPulling="2026-04-24 21:21:48.655394181 +0000 UTC m=+307.256549720" lastFinishedPulling="2026-04-24 21:21:53.208699515 +0000 UTC m=+311.809855059" observedRunningTime="2026-04-24 21:21:54.088172414 +0000 UTC m=+312.689327989" watchObservedRunningTime="2026-04-24 21:21:54.090484741 +0000 UTC m=+312.691640351"
Apr 24 21:21:54.103528 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.103474 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-dznv7" podStartSLOduration=1.516063971 podStartE2EDuration="6.103455358s" podCreationTimestamp="2026-04-24 21:21:48 +0000 UTC" firstStartedPulling="2026-04-24 21:21:48.676336219 +0000 UTC m=+307.277491757" lastFinishedPulling="2026-04-24 21:21:53.263727605 +0000 UTC m=+311.864883144" observedRunningTime="2026-04-24 21:21:54.10271396 +0000 UTC m=+312.703869522" watchObservedRunningTime="2026-04-24 21:21:54.103455358 +0000 UTC m=+312.704610920"
Apr 24 21:21:54.116989 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:21:54.116933 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" podStartSLOduration=1.5958875319999999 podStartE2EDuration="6.11691594s" podCreationTimestamp="2026-04-24 21:21:48 +0000 UTC" firstStartedPulling="2026-04-24 21:21:48.607276161 +0000 UTC m=+307.208431702" lastFinishedPulling="2026-04-24 21:21:53.128304567 +0000 UTC m=+311.729460110" observedRunningTime="2026-04-24 21:21:54.116099233 +0000 UTC m=+312.717254793" watchObservedRunningTime="2026-04-24 21:21:54.11691594 +0000 UTC m=+312.718071501"
Apr 24 21:22:00.078314 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:00.078283 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-dznv7"
Apr 24 21:22:25.078679 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:25.078592 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-ngqtv"
Apr 24 21:22:25.081523 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:25.081503 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p"
Apr 24 21:22:26.327733 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.327695 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-ngx4p"]
Apr 24 21:22:26.328175 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.327944 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" podUID="6214ebb1-ab50-4683-8130-dacaf05d43fd" containerName="manager" containerID="cri-o://2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963" gracePeriod=10
Apr 24 21:22:26.347452 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.347424 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-wpmzw"]
Apr 24 21:22:26.350979 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.350962 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.358112 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.358087 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-wpmzw"]
Apr 24 21:22:26.369917 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.369881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rndt\" (UniqueName: \"kubernetes.io/projected/8a492e9d-76bb-4b11-a920-f3cad9b5915d-kube-api-access-6rndt\") pod \"kserve-controller-manager-74fc8f6f96-wpmzw\" (UID: \"8a492e9d-76bb-4b11-a920-f3cad9b5915d\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.370078 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.369975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a492e9d-76bb-4b11-a920-f3cad9b5915d-cert\") pod \"kserve-controller-manager-74fc8f6f96-wpmzw\" (UID: \"8a492e9d-76bb-4b11-a920-f3cad9b5915d\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.470391 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.470357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rndt\" (UniqueName: \"kubernetes.io/projected/8a492e9d-76bb-4b11-a920-f3cad9b5915d-kube-api-access-6rndt\") pod \"kserve-controller-manager-74fc8f6f96-wpmzw\" (UID: \"8a492e9d-76bb-4b11-a920-f3cad9b5915d\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.470546 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.470417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a492e9d-76bb-4b11-a920-f3cad9b5915d-cert\") pod \"kserve-controller-manager-74fc8f6f96-wpmzw\" (UID: \"8a492e9d-76bb-4b11-a920-f3cad9b5915d\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.472932 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.472877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a492e9d-76bb-4b11-a920-f3cad9b5915d-cert\") pod \"kserve-controller-manager-74fc8f6f96-wpmzw\" (UID: \"8a492e9d-76bb-4b11-a920-f3cad9b5915d\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.478433 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.478404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rndt\" (UniqueName: \"kubernetes.io/projected/8a492e9d-76bb-4b11-a920-f3cad9b5915d-kube-api-access-6rndt\") pod \"kserve-controller-manager-74fc8f6f96-wpmzw\" (UID: \"8a492e9d-76bb-4b11-a920-f3cad9b5915d\") " pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.562298 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.562272 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p"
Apr 24 21:22:26.571002 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.570978 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbvgn\" (UniqueName: \"kubernetes.io/projected/6214ebb1-ab50-4683-8130-dacaf05d43fd-kube-api-access-xbvgn\") pod \"6214ebb1-ab50-4683-8130-dacaf05d43fd\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") "
Apr 24 21:22:26.571148 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.571020 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6214ebb1-ab50-4683-8130-dacaf05d43fd-cert\") pod \"6214ebb1-ab50-4683-8130-dacaf05d43fd\" (UID: \"6214ebb1-ab50-4683-8130-dacaf05d43fd\") "
Apr 24 21:22:26.573143 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.573117 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6214ebb1-ab50-4683-8130-dacaf05d43fd-cert" (OuterVolumeSpecName: "cert") pod "6214ebb1-ab50-4683-8130-dacaf05d43fd" (UID: "6214ebb1-ab50-4683-8130-dacaf05d43fd"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:22:26.573232 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.573185 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6214ebb1-ab50-4683-8130-dacaf05d43fd-kube-api-access-xbvgn" (OuterVolumeSpecName: "kube-api-access-xbvgn") pod "6214ebb1-ab50-4683-8130-dacaf05d43fd" (UID: "6214ebb1-ab50-4683-8130-dacaf05d43fd"). InnerVolumeSpecName "kube-api-access-xbvgn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:22:26.672550 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.672453 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbvgn\" (UniqueName: \"kubernetes.io/projected/6214ebb1-ab50-4683-8130-dacaf05d43fd-kube-api-access-xbvgn\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:22:26.672550 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.672492 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6214ebb1-ab50-4683-8130-dacaf05d43fd-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\""
Apr 24 21:22:26.711403 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.711371 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:26.824441 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:26.824413 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-wpmzw"]
Apr 24 21:22:26.827710 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:22:26.827683 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a492e9d_76bb_4b11_a920_f3cad9b5915d.slice/crio-8749d67aa16b3c97b0fbb0eea2c1aaea40ad3014a678510b59a0e7d734862057 WatchSource:0}: Error finding container 8749d67aa16b3c97b0fbb0eea2c1aaea40ad3014a678510b59a0e7d734862057: Status 404 returned error can't find the container with id 8749d67aa16b3c97b0fbb0eea2c1aaea40ad3014a678510b59a0e7d734862057
Apr 24 21:22:27.170749 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.170670 2575 generic.go:358] "Generic (PLEG): container finished" podID="6214ebb1-ab50-4683-8130-dacaf05d43fd" containerID="2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963" exitCode=0
Apr 24 21:22:27.170749 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.170729 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p"
Apr 24 21:22:27.170948 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.170722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" event={"ID":"6214ebb1-ab50-4683-8130-dacaf05d43fd","Type":"ContainerDied","Data":"2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963"}
Apr 24 21:22:27.170948 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.170878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-ngx4p" event={"ID":"6214ebb1-ab50-4683-8130-dacaf05d43fd","Type":"ContainerDied","Data":"f8e6183d0d6c188ba366ffdbb95bdf722e0f37e4a615811fadb4defbe5562155"}
Apr 24 21:22:27.170948 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.170904 2575 scope.go:117] "RemoveContainer" containerID="2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963"
Apr 24 21:22:27.172288 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.172249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw" event={"ID":"8a492e9d-76bb-4b11-a920-f3cad9b5915d","Type":"ContainerStarted","Data":"6b1d5fd2dc5d0a47296524f3ccb3f2db158e075aa6ccec60cb4cce20cb0aaf75"}
Apr 24 21:22:27.172288 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.172280 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw" event={"ID":"8a492e9d-76bb-4b11-a920-f3cad9b5915d","Type":"ContainerStarted","Data":"8749d67aa16b3c97b0fbb0eea2c1aaea40ad3014a678510b59a0e7d734862057"}
Apr 24 21:22:27.172466 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.172429 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:27.180503 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.180445 2575 scope.go:117] "RemoveContainer" containerID="2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963"
Apr 24 21:22:27.180830 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:22:27.180795 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963\": container with ID starting with 2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963 not found: ID does not exist" containerID="2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963"
Apr 24 21:22:27.180948 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.180837 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963"} err="failed to get container status \"2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963\": rpc error: code = NotFound desc = could not find container \"2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963\": container with ID starting with 2590f9f38c6780cad59a135aa34fdca85cec5ea1b714abf9fc2598acb1f8d963 not found: ID does not exist"
Apr 24 21:22:27.188508 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.188458 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw" podStartSLOduration=0.905465665 podStartE2EDuration="1.188443486s" podCreationTimestamp="2026-04-24 21:22:26 +0000 UTC" firstStartedPulling="2026-04-24 21:22:26.828899117 +0000 UTC m=+345.430054656" lastFinishedPulling="2026-04-24 21:22:27.111876938 +0000 UTC m=+345.713032477" observedRunningTime="2026-04-24 21:22:27.186362162 +0000 UTC m=+345.787517720" watchObservedRunningTime="2026-04-24 21:22:27.188443486 +0000 UTC m=+345.789599075"
Apr 24 21:22:27.196339 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.196315 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-ngx4p"]
Apr 24 21:22:27.197614 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:27.197590 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-74fc8f6f96-ngx4p"]
Apr 24 21:22:28.006196 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:28.006162 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6214ebb1-ab50-4683-8130-dacaf05d43fd" path="/var/lib/kubelet/pods/6214ebb1-ab50-4683-8130-dacaf05d43fd/volumes"
Apr 24 21:22:58.181331 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:58.181306 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-74fc8f6f96-wpmzw"
Apr 24 21:22:58.995677 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:58.995644 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-d727s"]
Apr 24 21:22:58.996005 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:58.995989 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6214ebb1-ab50-4683-8130-dacaf05d43fd" containerName="manager"
Apr 24 21:22:58.996099 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:58.996008 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6214ebb1-ab50-4683-8130-dacaf05d43fd" containerName="manager"
Apr 24 21:22:58.996158 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:58.996129 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6214ebb1-ab50-4683-8130-dacaf05d43fd" containerName="manager"
Apr 24 21:22:58.999030 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:58.999009 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.003053 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.003032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-2tmdh\""
Apr 24 21:22:59.003191 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.003050 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 21:22:59.012001 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.011977 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-d727s"]
Apr 24 21:22:59.119149 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.119115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6xw4\" (UniqueName: \"kubernetes.io/projected/3433e7de-2e72-444e-86cc-d175dfdf9fad-kube-api-access-g6xw4\") pod \"model-serving-api-86f7b4b499-d727s\" (UID: \"3433e7de-2e72-444e-86cc-d175dfdf9fad\") " pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.119328 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.119164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3433e7de-2e72-444e-86cc-d175dfdf9fad-tls-certs\") pod \"model-serving-api-86f7b4b499-d727s\" (UID: \"3433e7de-2e72-444e-86cc-d175dfdf9fad\") " pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.219809 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.219775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6xw4\" (UniqueName: \"kubernetes.io/projected/3433e7de-2e72-444e-86cc-d175dfdf9fad-kube-api-access-g6xw4\") pod \"model-serving-api-86f7b4b499-d727s\" (UID: \"3433e7de-2e72-444e-86cc-d175dfdf9fad\") " pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.220210 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.219823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3433e7de-2e72-444e-86cc-d175dfdf9fad-tls-certs\") pod \"model-serving-api-86f7b4b499-d727s\" (UID: \"3433e7de-2e72-444e-86cc-d175dfdf9fad\") " pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.222293 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.222273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3433e7de-2e72-444e-86cc-d175dfdf9fad-tls-certs\") pod \"model-serving-api-86f7b4b499-d727s\" (UID: \"3433e7de-2e72-444e-86cc-d175dfdf9fad\") " pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.230864 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.230839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6xw4\" (UniqueName: \"kubernetes.io/projected/3433e7de-2e72-444e-86cc-d175dfdf9fad-kube-api-access-g6xw4\") pod \"model-serving-api-86f7b4b499-d727s\" (UID: \"3433e7de-2e72-444e-86cc-d175dfdf9fad\") " pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.308718 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.308688 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:22:59.429476 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:22:59.429438 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-d727s"]
Apr 24 21:22:59.433401 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:22:59.433369 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3433e7de_2e72_444e_86cc_d175dfdf9fad.slice/crio-74e93d8d1c29fef95f3e2a5f763a6736bae30cc30c8523e53615f8ca9dc86c84 WatchSource:0}: Error finding container 74e93d8d1c29fef95f3e2a5f763a6736bae30cc30c8523e53615f8ca9dc86c84: Status 404 returned error can't find the container with id 74e93d8d1c29fef95f3e2a5f763a6736bae30cc30c8523e53615f8ca9dc86c84
Apr 24 21:23:00.276346 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:00.276303 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-d727s" event={"ID":"3433e7de-2e72-444e-86cc-d175dfdf9fad","Type":"ContainerStarted","Data":"74e93d8d1c29fef95f3e2a5f763a6736bae30cc30c8523e53615f8ca9dc86c84"}
Apr 24 21:23:01.280894 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:01.280793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-d727s" event={"ID":"3433e7de-2e72-444e-86cc-d175dfdf9fad","Type":"ContainerStarted","Data":"2cce4ec8ed311c164910ea60a400d09885964742938ff7ae488370ecf0a730fd"}
Apr 24 21:23:01.280894 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:01.280843 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:23:01.299577 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:01.299526 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-d727s" podStartSLOduration=1.7105834149999999 podStartE2EDuration="3.299511381s" podCreationTimestamp="2026-04-24 21:22:58 +0000 UTC" firstStartedPulling="2026-04-24 21:22:59.435003219 +0000 UTC m=+378.036158762" lastFinishedPulling="2026-04-24 21:23:01.023931189 +0000 UTC m=+379.625086728" observedRunningTime="2026-04-24 21:23:01.298417831 +0000 UTC m=+379.899573392" watchObservedRunningTime="2026-04-24 21:23:01.299511381 +0000 UTC m=+379.900666941"
Apr 24 21:23:05.732589 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.732560 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-764679f488-nl8lb"]
Apr 24 21:23:05.735771 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.735755 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.752581 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.752556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764679f488-nl8lb"]
Apr 24 21:23:05.873887 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.873847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-console-config\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.874116 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.873910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-service-ca\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.874116 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.873951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78835ff-aef3-4310-90c9-c3379838d103-console-serving-cert\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.874116 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.873984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e78835ff-aef3-4310-90c9-c3379838d103-console-oauth-config\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.874116 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.874019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-oauth-serving-cert\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.874116 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.874055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-trusted-ca-bundle\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.874323 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.874141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpt4b\" (UniqueName: \"kubernetes.io/projected/e78835ff-aef3-4310-90c9-c3379838d103-kube-api-access-lpt4b\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.974773 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.974737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78835ff-aef3-4310-90c9-c3379838d103-console-serving-cert\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.974773 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.974790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e78835ff-aef3-4310-90c9-c3379838d103-console-oauth-config\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975054 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.974827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-oauth-serving-cert\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975054 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.974859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-trusted-ca-bundle\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975054 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.974901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpt4b\" (UniqueName: \"kubernetes.io/projected/e78835ff-aef3-4310-90c9-c3379838d103-kube-api-access-lpt4b\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975054 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.974924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-console-config\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975054 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.974968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-service-ca\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975728 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.975690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-oauth-serving-cert\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.975815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-trusted-ca-bundle\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.975825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-console-config\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.975911 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.975845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e78835ff-aef3-4310-90c9-c3379838d103-service-ca\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.977366 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.977346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e78835ff-aef3-4310-90c9-c3379838d103-console-oauth-config\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.977459 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.977441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e78835ff-aef3-4310-90c9-c3379838d103-console-serving-cert\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:05.983601 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:05.983552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpt4b\" (UniqueName: \"kubernetes.io/projected/e78835ff-aef3-4310-90c9-c3379838d103-kube-api-access-lpt4b\") pod \"console-764679f488-nl8lb\" (UID: \"e78835ff-aef3-4310-90c9-c3379838d103\") " pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:06.044627 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:06.044590 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:06.168868 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:06.168842 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764679f488-nl8lb"]
Apr 24 21:23:06.171508 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:23:06.171482 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode78835ff_aef3_4310_90c9_c3379838d103.slice/crio-37a9a6c31aa9630349b16b84bb35a4aa3f2d57d64571b64fb00bb9f276f18edf WatchSource:0}: Error finding container 37a9a6c31aa9630349b16b84bb35a4aa3f2d57d64571b64fb00bb9f276f18edf: Status 404 returned error can't find the container with id 37a9a6c31aa9630349b16b84bb35a4aa3f2d57d64571b64fb00bb9f276f18edf
Apr 24 21:23:06.298375 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:06.298345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764679f488-nl8lb" event={"ID":"e78835ff-aef3-4310-90c9-c3379838d103","Type":"ContainerStarted","Data":"0dca9bd8f519219a63cf1d9c52291645901c60dbe35016e8f8f7cfd69bfda027"}
Apr 24 21:23:06.298559 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:06.298383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764679f488-nl8lb" event={"ID":"e78835ff-aef3-4310-90c9-c3379838d103","Type":"ContainerStarted","Data":"37a9a6c31aa9630349b16b84bb35a4aa3f2d57d64571b64fb00bb9f276f18edf"}
Apr 24 21:23:06.318379 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:06.318325 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-764679f488-nl8lb" podStartSLOduration=1.318310654 podStartE2EDuration="1.318310654s" podCreationTimestamp="2026-04-24 21:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:23:06.3182741 +0000 UTC m=+384.919429673" watchObservedRunningTime="2026-04-24 21:23:06.318310654 +0000 UTC m=+384.919466215"
Apr 24 21:23:12.288824 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:12.288796 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-d727s"
Apr 24 21:23:16.045080 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:16.045034 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:16.045080 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:16.045087 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:16.049829 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:16.049802 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:16.342227 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:16.342197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-764679f488-nl8lb"
Apr 24 21:23:16.404680 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:16.404647 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b54c958b4-lsfg5"]
Apr 24 21:23:41.424886 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.424829 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b54c958b4-lsfg5" podUID="c1870a54-cb8c-4a45-ab0f-3a74da0062cd" containerName="console" containerID="cri-o://98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e" gracePeriod=15
Apr 24 21:23:41.655744 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.655722 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b54c958b4-lsfg5_c1870a54-cb8c-4a45-ab0f-3a74da0062cd/console/0.log"
Apr 24 21:23:41.655868 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.655784 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b54c958b4-lsfg5"
Apr 24 21:23:41.766692 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.766596 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-oauth-config\") pod \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") "
Apr 24 21:23:41.766692 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.766675 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-trusted-ca-bundle\") pod \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") "
Apr 24 21:23:41.766916 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.766714 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-config\") pod \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") "
Apr 24 21:23:41.766916 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.766737 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-oauth-serving-cert\") pod \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") "
Apr 24 21:23:41.766916 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.766766 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-serving-cert\") pod \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") "
Apr 24 21:23:41.766916 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.766798 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-service-ca\") pod \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") "
Apr 24 21:23:41.766916 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.766837 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9hhk\" (UniqueName: \"kubernetes.io/projected/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-kube-api-access-n9hhk\") pod \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\" (UID: \"c1870a54-cb8c-4a45-ab0f-3a74da0062cd\") "
Apr 24 21:23:41.767196 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.767127 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c1870a54-cb8c-4a45-ab0f-3a74da0062cd" (UID: "c1870a54-cb8c-4a45-ab0f-3a74da0062cd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:23:41.767257 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.767203 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "c1870a54-cb8c-4a45-ab0f-3a74da0062cd" (UID: "c1870a54-cb8c-4a45-ab0f-3a74da0062cd"). InnerVolumeSpecName "service-ca".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:23:41.767257 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.767209 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c1870a54-cb8c-4a45-ab0f-3a74da0062cd" (UID: "c1870a54-cb8c-4a45-ab0f-3a74da0062cd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:23:41.767390 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.767365 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-config" (OuterVolumeSpecName: "console-config") pod "c1870a54-cb8c-4a45-ab0f-3a74da0062cd" (UID: "c1870a54-cb8c-4a45-ab0f-3a74da0062cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:23:41.768840 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.768810 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c1870a54-cb8c-4a45-ab0f-3a74da0062cd" (UID: "c1870a54-cb8c-4a45-ab0f-3a74da0062cd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:23:41.769009 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.768986 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-kube-api-access-n9hhk" (OuterVolumeSpecName: "kube-api-access-n9hhk") pod "c1870a54-cb8c-4a45-ab0f-3a74da0062cd" (UID: "c1870a54-cb8c-4a45-ab0f-3a74da0062cd"). InnerVolumeSpecName "kube-api-access-n9hhk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:23:41.769109 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.769029 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c1870a54-cb8c-4a45-ab0f-3a74da0062cd" (UID: "c1870a54-cb8c-4a45-ab0f-3a74da0062cd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:23:41.868083 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.868043 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-oauth-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:23:41.868264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.868095 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-trusted-ca-bundle\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:23:41.868264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.868110 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-config\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:23:41.868264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.868122 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-oauth-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:23:41.868264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.868133 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-console-serving-cert\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:23:41.868264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.868146 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-service-ca\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:23:41.868264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:41.868158 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n9hhk\" (UniqueName: \"kubernetes.io/projected/c1870a54-cb8c-4a45-ab0f-3a74da0062cd-kube-api-access-n9hhk\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:23:42.417428 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.417393 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1870a54-cb8c-4a45-ab0f-3a74da0062cd" containerID="98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e" exitCode=2 Apr 24 21:23:42.417609 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.417452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b54c958b4-lsfg5" event={"ID":"c1870a54-cb8c-4a45-ab0f-3a74da0062cd","Type":"ContainerDied","Data":"98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e"} Apr 24 21:23:42.417609 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.417486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b54c958b4-lsfg5" event={"ID":"c1870a54-cb8c-4a45-ab0f-3a74da0062cd","Type":"ContainerDied","Data":"2932710e2ddfbc1e9c6d417f60950e402e590bc7385cdddf71efe331a592e3b4"} Apr 24 21:23:42.417609 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.417489 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b54c958b4-lsfg5" Apr 24 21:23:42.417609 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.417506 2575 scope.go:117] "RemoveContainer" containerID="98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e" Apr 24 21:23:42.425173 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.425151 2575 scope.go:117] "RemoveContainer" containerID="98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e" Apr 24 21:23:42.425452 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:23:42.425411 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e\": container with ID starting with 98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e not found: ID does not exist" containerID="98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e" Apr 24 21:23:42.425452 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.425437 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e"} err="failed to get container status \"98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e\": rpc error: code = NotFound desc = could not find container \"98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e\": container with ID starting with 98b8ed34040ab36df8dbc782e2d3dda48ff53fe7258957ce26d3a1445503d70e not found: ID does not exist" Apr 24 21:23:42.468577 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.468540 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b54c958b4-lsfg5"] Apr 24 21:23:42.472751 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:23:42.472720 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b54c958b4-lsfg5"] Apr 24 21:23:44.006208 ip-10-0-141-46 kubenswrapper[2575]: I0424 
21:23:44.006174 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1870a54-cb8c-4a45-ab0f-3a74da0062cd" path="/var/lib/kubelet/pods/c1870a54-cb8c-4a45-ab0f-3a74da0062cd/volumes" Apr 24 21:26:41.906716 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:26:41.906684 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:26:41.907232 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:26:41.907139 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:27:00.249804 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.249715 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2"] Apr 24 21:27:00.252170 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.250107 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1870a54-cb8c-4a45-ab0f-3a74da0062cd" containerName="console" Apr 24 21:27:00.252170 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.250121 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1870a54-cb8c-4a45-ab0f-3a74da0062cd" containerName="console" Apr 24 21:27:00.252170 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.250168 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1870a54-cb8c-4a45-ab0f-3a74da0062cd" containerName="console" Apr 24 21:27:00.253081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.253045 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:00.255346 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.255325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-da412-serving-cert\"" Apr 24 21:27:00.255702 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.255681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:27:00.255702 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.255698 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-da412-kube-rbac-proxy-sar-config\"" Apr 24 21:27:00.255865 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.255768 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4gsmf\"" Apr 24 21:27:00.262844 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.262822 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2"] Apr 24 21:27:00.309551 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.309519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7862b8-8f4d-4383-a836-02b97d0c81a1-openshift-service-ca-bundle\") pod \"model-chainer-raw-da412-6cd7bcff7-rvqs2\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:00.309738 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.309579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls\") pod \"model-chainer-raw-da412-6cd7bcff7-rvqs2\" 
(UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:00.410701 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.410666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7862b8-8f4d-4383-a836-02b97d0c81a1-openshift-service-ca-bundle\") pod \"model-chainer-raw-da412-6cd7bcff7-rvqs2\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:00.410908 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.410715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls\") pod \"model-chainer-raw-da412-6cd7bcff7-rvqs2\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:00.410908 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:27:00.410837 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-da412-serving-cert: secret "model-chainer-raw-da412-serving-cert" not found Apr 24 21:27:00.410908 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:27:00.410898 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls podName:0b7862b8-8f4d-4383-a836-02b97d0c81a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.91088016 +0000 UTC m=+619.512035703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls") pod "model-chainer-raw-da412-6cd7bcff7-rvqs2" (UID: "0b7862b8-8f4d-4383-a836-02b97d0c81a1") : secret "model-chainer-raw-da412-serving-cert" not found Apr 24 21:27:00.411322 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.411302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7862b8-8f4d-4383-a836-02b97d0c81a1-openshift-service-ca-bundle\") pod \"model-chainer-raw-da412-6cd7bcff7-rvqs2\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:00.913981 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.913940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls\") pod \"model-chainer-raw-da412-6cd7bcff7-rvqs2\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:00.916406 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:00.916386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls\") pod \"model-chainer-raw-da412-6cd7bcff7-rvqs2\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:01.164409 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:01.164292 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:01.284472 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:01.284444 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2"] Apr 24 21:27:01.294553 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:01.294530 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:02.039636 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:02.039604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" event={"ID":"0b7862b8-8f4d-4383-a836-02b97d0c81a1","Type":"ContainerStarted","Data":"fada5e66ed693ff02ac97fd6cdd87d8942a3d351d800faf066def0c075bee929"} Apr 24 21:27:04.048126 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:04.048092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" event={"ID":"0b7862b8-8f4d-4383-a836-02b97d0c81a1","Type":"ContainerStarted","Data":"5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03"} Apr 24 21:27:04.048540 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:04.048193 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:04.066469 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:04.066413 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podStartSLOduration=1.926603224 podStartE2EDuration="4.066399337s" podCreationTimestamp="2026-04-24 21:27:00 +0000 UTC" firstStartedPulling="2026-04-24 21:27:01.294654763 +0000 UTC m=+619.895810302" lastFinishedPulling="2026-04-24 21:27:03.43445087 +0000 UTC m=+622.035606415" observedRunningTime="2026-04-24 21:27:04.065503922 +0000 UTC m=+622.666659494" 
watchObservedRunningTime="2026-04-24 21:27:04.066399337 +0000 UTC m=+622.667554899" Apr 24 21:27:10.057504 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:10.057475 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:10.293337 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:10.293305 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2"] Apr 24 21:27:10.293552 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:10.293513 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" containerID="cri-o://5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03" gracePeriod=30 Apr 24 21:27:15.055656 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:15.055617 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:20.055737 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:20.055696 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:25.056146 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:25.056101 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
21:27:25.056619 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:25.056217 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:30.055330 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:30.055294 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:35.056008 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:35.055966 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:40.055547 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:40.055505 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:27:40.931241 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:40.931212 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:41.025583 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.025546 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls\") pod \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " Apr 24 21:27:41.025762 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.025627 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7862b8-8f4d-4383-a836-02b97d0c81a1-openshift-service-ca-bundle\") pod \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\" (UID: \"0b7862b8-8f4d-4383-a836-02b97d0c81a1\") " Apr 24 21:27:41.025970 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.025943 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b7862b8-8f4d-4383-a836-02b97d0c81a1-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0b7862b8-8f4d-4383-a836-02b97d0c81a1" (UID: "0b7862b8-8f4d-4383-a836-02b97d0c81a1"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:27:41.027750 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.027724 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b7862b8-8f4d-4383-a836-02b97d0c81a1" (UID: "0b7862b8-8f4d-4383-a836-02b97d0c81a1"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:27:41.126948 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.126844 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b7862b8-8f4d-4383-a836-02b97d0c81a1-proxy-tls\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:27:41.126948 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.126891 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b7862b8-8f4d-4383-a836-02b97d0c81a1-openshift-service-ca-bundle\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:27:41.163910 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.163870 2575 generic.go:358] "Generic (PLEG): container finished" podID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerID="5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03" exitCode=0 Apr 24 21:27:41.164096 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.163945 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" Apr 24 21:27:41.164096 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.163958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" event={"ID":"0b7862b8-8f4d-4383-a836-02b97d0c81a1","Type":"ContainerDied","Data":"5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03"} Apr 24 21:27:41.164096 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.164002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2" event={"ID":"0b7862b8-8f4d-4383-a836-02b97d0c81a1","Type":"ContainerDied","Data":"fada5e66ed693ff02ac97fd6cdd87d8942a3d351d800faf066def0c075bee929"} Apr 24 21:27:41.164096 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.164021 2575 scope.go:117] "RemoveContainer" containerID="5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03" Apr 24 21:27:41.171873 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.171843 2575 scope.go:117] "RemoveContainer" containerID="5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03" Apr 24 21:27:41.172240 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:27:41.172216 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03\": container with ID starting with 5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03 not found: ID does not exist" containerID="5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03" Apr 24 21:27:41.172336 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.172248 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03"} err="failed to get container status 
\"5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03\": rpc error: code = NotFound desc = could not find container \"5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03\": container with ID starting with 5adabbe34275f2cf5d319c5e4b5ef5f249feef5e0e98a0d9e89be1807b7aed03 not found: ID does not exist" Apr 24 21:27:41.184318 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.184294 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2"] Apr 24 21:27:41.187870 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:41.187849 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-da412-6cd7bcff7-rvqs2"] Apr 24 21:27:42.007924 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:27:42.007891 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" path="/var/lib/kubelet/pods/0b7862b8-8f4d-4383-a836-02b97d0c81a1/volumes" Apr 24 21:28:40.538828 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.538739 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh"] Apr 24 21:28:40.539577 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.539224 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" Apr 24 21:28:40.539577 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.539244 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" Apr 24 21:28:40.539577 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.539316 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b7862b8-8f4d-4383-a836-02b97d0c81a1" containerName="model-chainer-raw-da412" Apr 24 21:28:40.542372 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.542345 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:40.545006 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.544978 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-07ea4-serving-cert\"" Apr 24 21:28:40.545172 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.544979 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-07ea4-kube-rbac-proxy-sar-config\"" Apr 24 21:28:40.545172 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.545100 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:28:40.545172 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.545102 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4gsmf\"" Apr 24 21:28:40.552564 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.552539 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh"] Apr 24 21:28:40.592762 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.592711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls\") pod \"model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:40.592967 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.592782 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5b958c6d-4937-418e-80f3-92a4d099c830-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:40.693344 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.693304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls\") pod \"model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:40.693344 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.693350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b958c6d-4937-418e-80f3-92a4d099c830-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:40.693544 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:28:40.693484 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-serving-cert: secret "model-chainer-raw-hpa-07ea4-serving-cert" not found Apr 24 21:28:40.693590 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:28:40.693565 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls podName:5b958c6d-4937-418e-80f3-92a4d099c830 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:41.1935466 +0000 UTC m=+719.794702139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls") pod "model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" (UID: "5b958c6d-4937-418e-80f3-92a4d099c830") : secret "model-chainer-raw-hpa-07ea4-serving-cert" not found Apr 24 21:28:40.694008 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:40.693987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b958c6d-4937-418e-80f3-92a4d099c830-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:41.196918 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:41.196870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls\") pod \"model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:41.199263 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:41.199235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls\") pod \"model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:41.454038 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:41.453932 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:41.575395 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:41.575365 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh"] Apr 24 21:28:41.577789 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:28:41.577760 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b958c6d_4937_418e_80f3_92a4d099c830.slice/crio-1ab6a697a57a6a2e94ffbc1b9c5786fed712bde641795d40f5a2007c9314540b WatchSource:0}: Error finding container 1ab6a697a57a6a2e94ffbc1b9c5786fed712bde641795d40f5a2007c9314540b: Status 404 returned error can't find the container with id 1ab6a697a57a6a2e94ffbc1b9c5786fed712bde641795d40f5a2007c9314540b Apr 24 21:28:42.358175 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:42.358108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" event={"ID":"5b958c6d-4937-418e-80f3-92a4d099c830","Type":"ContainerStarted","Data":"3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8"} Apr 24 21:28:42.358175 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:42.358147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" event={"ID":"5b958c6d-4937-418e-80f3-92a4d099c830","Type":"ContainerStarted","Data":"1ab6a697a57a6a2e94ffbc1b9c5786fed712bde641795d40f5a2007c9314540b"} Apr 24 21:28:42.358175 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:42.358184 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:42.373543 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:42.373496 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podStartSLOduration=2.373480597 podStartE2EDuration="2.373480597s" podCreationTimestamp="2026-04-24 21:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:42.372203397 +0000 UTC m=+720.973358972" watchObservedRunningTime="2026-04-24 21:28:42.373480597 +0000 UTC m=+720.974636157" Apr 24 21:28:48.367081 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:48.367030 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:28:50.588539 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:50.588506 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh"] Apr 24 21:28:50.589039 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:50.588763 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" containerID="cri-o://3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8" gracePeriod=30 Apr 24 21:28:53.366168 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:53.366125 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:58.365808 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:28:58.365766 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 24 21:29:03.365600 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:03.365561 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:03.366098 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:03.365733 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:29:08.366301 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:08.366261 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:13.365860 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:13.365817 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:18.365833 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:18.365793 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:29:20.740485 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:20.740458 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:29:20.924932 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:20.924839 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls\") pod \"5b958c6d-4937-418e-80f3-92a4d099c830\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " Apr 24 21:29:20.924932 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:20.924910 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b958c6d-4937-418e-80f3-92a4d099c830-openshift-service-ca-bundle\") pod \"5b958c6d-4937-418e-80f3-92a4d099c830\" (UID: \"5b958c6d-4937-418e-80f3-92a4d099c830\") " Apr 24 21:29:20.925219 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:20.925193 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b958c6d-4937-418e-80f3-92a4d099c830-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5b958c6d-4937-418e-80f3-92a4d099c830" (UID: "5b958c6d-4937-418e-80f3-92a4d099c830"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:20.926906 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:20.926880 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b958c6d-4937-418e-80f3-92a4d099c830" (UID: "5b958c6d-4937-418e-80f3-92a4d099c830"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:21.026207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.026172 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b958c6d-4937-418e-80f3-92a4d099c830-openshift-service-ca-bundle\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.026207 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.026202 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b958c6d-4937-418e-80f3-92a4d099c830-proxy-tls\") on node \"ip-10-0-141-46.ec2.internal\" DevicePath \"\"" Apr 24 21:29:21.476986 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.476948 2575 generic.go:358] "Generic (PLEG): container finished" podID="5b958c6d-4937-418e-80f3-92a4d099c830" containerID="3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8" exitCode=0 Apr 24 21:29:21.476986 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.476992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" event={"ID":"5b958c6d-4937-418e-80f3-92a4d099c830","Type":"ContainerDied","Data":"3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8"} Apr 24 21:29:21.477341 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.477012 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" Apr 24 21:29:21.477341 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.477014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh" event={"ID":"5b958c6d-4937-418e-80f3-92a4d099c830","Type":"ContainerDied","Data":"1ab6a697a57a6a2e94ffbc1b9c5786fed712bde641795d40f5a2007c9314540b"} Apr 24 21:29:21.477341 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.477034 2575 scope.go:117] "RemoveContainer" containerID="3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8" Apr 24 21:29:21.485056 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.485037 2575 scope.go:117] "RemoveContainer" containerID="3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8" Apr 24 21:29:21.485349 ip-10-0-141-46 kubenswrapper[2575]: E0424 21:29:21.485330 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8\": container with ID starting with 3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8 not found: ID does not exist" containerID="3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8" Apr 24 21:29:21.485408 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.485360 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8"} err="failed to get container status \"3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8\": rpc error: code = NotFound desc = could not find container \"3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8\": container with ID starting with 3042d4667787b34f6173c2c01c0b20da0d04cbfdf48f69757e0b2e1b3a1ebba8 not found: ID does not exist" Apr 24 21:29:21.513789 ip-10-0-141-46 kubenswrapper[2575]: 
I0424 21:29:21.513756 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh"] Apr 24 21:29:21.528679 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:21.528648 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-07ea4-759d56d8b5-jpmwh"] Apr 24 21:29:22.006482 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:29:22.006450 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" path="/var/lib/kubelet/pods/5b958c6d-4937-418e-80f3-92a4d099c830/volumes" Apr 24 21:31:41.925264 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:31:41.925190 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:31:41.927501 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:31:41.927478 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:36:41.952329 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:36:41.952302 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:36:41.954132 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:36:41.954111 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log" Apr 24 21:38:02.788866 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:02.788836 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fvzdq_527f1b83-b8e1-4bd9-bb12-bbf7a6ab672a/global-pull-secret-syncer/0.log" Apr 24 21:38:02.888024 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:02.887987 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kube-system_konnectivity-agent-5r6d9_af0483bc-341e-4d8b-a4d1-0e5fb17d43a8/konnectivity-agent/0.log" Apr 24 21:38:03.053147 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:03.053116 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-46.ec2.internal_6551c028e13aff466c38397d8a508ac4/haproxy/0.log" Apr 24 21:38:06.844816 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:06.844787 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4f4tq_721680e7-1c21-4de2-ad20-76f3ca88fdf4/node-exporter/0.log" Apr 24 21:38:06.868156 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:06.868121 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4f4tq_721680e7-1c21-4de2-ad20-76f3ca88fdf4/kube-rbac-proxy/0.log" Apr 24 21:38:06.895425 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:06.895399 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4f4tq_721680e7-1c21-4de2-ad20-76f3ca88fdf4/init-textfile/0.log" Apr 24 21:38:07.517337 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:07.517307 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7j6fz_871d248c-2080-42f6-b548-3ed0e9f28ff0/prometheus-operator/0.log" Apr 24 21:38:07.545988 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:07.545961 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-7j6fz_871d248c-2080-42f6-b548-3ed0e9f28ff0/kube-rbac-proxy/0.log" Apr 24 21:38:07.584005 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:07.583975 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9sb4g_15cf2e85-2ffe-4348-8ba9-0a843b7ba7e6/prometheus-operator-admission-webhook/0.log" Apr 24 21:38:09.663080 ip-10-0-141-46 
kubenswrapper[2575]: I0424 21:38:09.663033 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm"] Apr 24 21:38:09.663459 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.663440 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" Apr 24 21:38:09.663503 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.663464 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" Apr 24 21:38:09.663561 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.663548 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b958c6d-4937-418e-80f3-92a4d099c830" containerName="model-chainer-raw-hpa-07ea4" Apr 24 21:38:09.666485 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.666465 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.669594 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.669574 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pg6qz\"/\"openshift-service-ca.crt\"" Apr 24 21:38:09.669733 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.669581 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pg6qz\"/\"kube-root-ca.crt\"" Apr 24 21:38:09.669733 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.669628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pg6qz\"/\"default-dockercfg-xvlcd\"" Apr 24 21:38:09.675344 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.675319 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm"] Apr 24 21:38:09.731360 ip-10-0-141-46 
kubenswrapper[2575]: I0424 21:38:09.731323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-proc\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.731360 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.731363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-lib-modules\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.731593 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.731488 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s9kd\" (UniqueName: \"kubernetes.io/projected/9df5973c-d7ae-43d1-a338-bcbe805a1329-kube-api-access-4s9kd\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.731593 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.731523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-sys\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.731593 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.731540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-podres\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.832677 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s9kd\" (UniqueName: \"kubernetes.io/projected/9df5973c-d7ae-43d1-a338-bcbe805a1329-kube-api-access-4s9kd\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.832874 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-sys\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.832874 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-sys\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.832874 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-podres\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.832874 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832834 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-proc\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.832874 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-lib-modules\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.833155 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-proc\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.833155 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.832961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-podres\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.833155 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.833002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9df5973c-d7ae-43d1-a338-bcbe805a1329-lib-modules\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 
21:38:09.842902 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.842877 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s9kd\" (UniqueName: \"kubernetes.io/projected/9df5973c-d7ae-43d1-a338-bcbe805a1329-kube-api-access-4s9kd\") pod \"perf-node-gather-daemonset-hbpfm\" (UID: \"9df5973c-d7ae-43d1-a338-bcbe805a1329\") " pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:09.976947 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:09.976852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" Apr 24 21:38:10.010222 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:10.010155 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764679f488-nl8lb_e78835ff-aef3-4310-90c9-c3379838d103/console/0.log" Apr 24 21:38:10.053571 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:10.053528 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-jpt2j_3aed0d1c-5d90-4a85-912d-d13220c855e2/download-server/0.log" Apr 24 21:38:10.105940 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:10.105911 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm"] Apr 24 21:38:10.108476 ip-10-0-141-46 kubenswrapper[2575]: W0424 21:38:10.108434 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9df5973c_d7ae_43d1_a338_bcbe805a1329.slice/crio-d7b548696c7034f214f70b0e9ed3cd8e88e333c968406378dcedc918ea8a78fa WatchSource:0}: Error finding container d7b548696c7034f214f70b0e9ed3cd8e88e333c968406378dcedc918ea8a78fa: Status 404 returned error can't find the container with id d7b548696c7034f214f70b0e9ed3cd8e88e333c968406378dcedc918ea8a78fa Apr 24 21:38:10.110038 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:10.110023 2575 provider.go:93] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:38:10.116080 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:10.116042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" event={"ID":"9df5973c-d7ae-43d1-a338-bcbe805a1329","Type":"ContainerStarted","Data":"d7b548696c7034f214f70b0e9ed3cd8e88e333c968406378dcedc918ea8a78fa"}
Apr 24 21:38:11.120590 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:11.120554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" event={"ID":"9df5973c-d7ae-43d1-a338-bcbe805a1329","Type":"ContainerStarted","Data":"52be98fb9c168125ba52778ceabd2d4660f869112223268cb4144bef0ae78de7"}
Apr 24 21:38:11.120958 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:11.120707 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm"
Apr 24 21:38:11.142877 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:11.142829 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm" podStartSLOduration=2.142813555 podStartE2EDuration="2.142813555s" podCreationTimestamp="2026-04-24 21:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:11.141386699 +0000 UTC m=+1289.742542259" watchObservedRunningTime="2026-04-24 21:38:11.142813555 +0000 UTC m=+1289.743969118"
Apr 24 21:38:11.318783 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:11.318753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4jr7n_22015fa1-7a0e-402e-b16d-c8403c4becda/dns/0.log"
Apr 24 21:38:11.346261 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:11.346229 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4jr7n_22015fa1-7a0e-402e-b16d-c8403c4becda/kube-rbac-proxy/0.log"
Apr 24 21:38:11.563627 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:11.563598 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rbbz9_aae051c8-9a36-4936-b30a-80aff28bff26/dns-node-resolver/0.log"
Apr 24 21:38:12.025372 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:12.025295 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-74b659f8d6-f49ft_07b1e968-a375-49e9-81d4-c36509044321/registry/0.log"
Apr 24 21:38:12.092594 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:12.092550 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d7mjl_d887b4fc-65dd-41bb-9013-2e633d3ab7a8/node-ca/0.log"
Apr 24 21:38:13.373502 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:13.373473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z8ffq_ccd8485f-5d27-4b5c-8155-58281c06943e/serve-healthcheck-canary/0.log"
Apr 24 21:38:13.977881 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:13.977840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2wft_a78c9c94-bf81-4cf1-9862-dd3d48a12eba/kube-rbac-proxy/0.log"
Apr 24 21:38:14.001903 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:14.001870 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2wft_a78c9c94-bf81-4cf1-9862-dd3d48a12eba/exporter/0.log"
Apr 24 21:38:14.042695 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:14.042663 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2wft_a78c9c94-bf81-4cf1-9862-dd3d48a12eba/extractor/0.log"
Apr 24 21:38:16.059853 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:16.059818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-74fc8f6f96-wpmzw_8a492e9d-76bb-4b11-a920-f3cad9b5915d/manager/0.log"
Apr 24 21:38:16.092611 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:16.092579 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-ngqtv_f23d65a8-d1c8-46a5-9de4-c141395162b0/manager/0.log"
Apr 24 21:38:16.118601 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:16.118575 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-d727s_3433e7de-2e72-444e-86cc-d175dfdf9fad/server/0.log"
Apr 24 21:38:16.353292 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:16.353201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-dznv7_23a5d406-2cd1-44fb-81d8-8bb8c63f5ea0/seaweedfs/0.log"
Apr 24 21:38:17.133889 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:17.133863 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pg6qz/perf-node-gather-daemonset-hbpfm"
Apr 24 21:38:21.968125 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:21.968098 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8829_5b89c06b-ff11-4cc0-bd26-7f792a0f1702/kube-multus-additional-cni-plugins/0.log"
Apr 24 21:38:21.992012 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:21.991982 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8829_5b89c06b-ff11-4cc0-bd26-7f792a0f1702/egress-router-binary-copy/0.log"
Apr 24 21:38:22.013256 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.013222 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8829_5b89c06b-ff11-4cc0-bd26-7f792a0f1702/cni-plugins/0.log"
Apr 24 21:38:22.047566 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.047537 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8829_5b89c06b-ff11-4cc0-bd26-7f792a0f1702/bond-cni-plugin/0.log"
Apr 24 21:38:22.068492 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.068451 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8829_5b89c06b-ff11-4cc0-bd26-7f792a0f1702/routeoverride-cni/0.log"
Apr 24 21:38:22.088658 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.088620 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8829_5b89c06b-ff11-4cc0-bd26-7f792a0f1702/whereabouts-cni-bincopy/0.log"
Apr 24 21:38:22.109700 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.109672 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-r8829_5b89c06b-ff11-4cc0-bd26-7f792a0f1702/whereabouts-cni/0.log"
Apr 24 21:38:22.365775 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.365735 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vlds8_2d51ef4d-9ece-4f73-b21d-de62e4a3b68e/kube-multus/0.log"
Apr 24 21:38:22.443382 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.443346 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qm8hh_f3b8d86d-c179-4368-b025-0c7f41b2aa3e/network-metrics-daemon/0.log"
Apr 24 21:38:22.471729 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:22.471690 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qm8hh_f3b8d86d-c179-4368-b025-0c7f41b2aa3e/kube-rbac-proxy/0.log"
Apr 24 21:38:23.637099 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.637050 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-controller/0.log"
Apr 24 21:38:23.656708 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.656681 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/0.log"
Apr 24 21:38:23.668683 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.668659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovn-acl-logging/1.log"
Apr 24 21:38:23.687108 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.687053 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/kube-rbac-proxy-node/0.log"
Apr 24 21:38:23.709271 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.709238 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 21:38:23.728082 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.728046 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/northd/0.log"
Apr 24 21:38:23.757696 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.757667 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/nbdb/0.log"
Apr 24 21:38:23.778963 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.778937 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/sbdb/0.log"
Apr 24 21:38:23.945316 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:23.945226 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9z5w_e3b4de04-a724-4231-a103-ae88c77beb64/ovnkube-controller/0.log"
Apr 24 21:38:25.362761 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:25.362729 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qfs2f_e42a906f-a474-4d11-8a0d-e8ef290b2e14/network-check-target-container/0.log"
Apr 24 21:38:26.293791 ip-10-0-141-46 kubenswrapper[2575]: I0424 21:38:26.293757 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8v98v_c230c44d-8923-409d-a9e3-443872457536/iptables-alerter/0.log"