Apr 16 14:27:09.697672 ip-10-0-141-239 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 14:27:09.697686 ip-10-0-141-239 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 14:27:09.697696 ip-10-0-141-239 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 14:27:09.698059 ip-10-0-141-239 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 14:27:19.821982 ip-10-0-141-239 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 14:27:19.821998 ip-10-0-141-239 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6c3e5bdf3d7940d19a96fc19c6fa22d3 --
Apr 16 14:29:41.656858 ip-10-0-141-239 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:29:42.089632 ip-10-0-141-239 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:42.089632 ip-10-0-141-239 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:29:42.089632 ip-10-0-141-239 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:42.089632 ip-10-0-141-239 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:29:42.089632 ip-10-0-141-239 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:42.092825 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.092719 2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:29:42.095718 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095698 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:42.095718 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095717 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:42.095718 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095724 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:42.095718 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095728 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095732 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095737 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095740 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095744 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095747 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095751 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095754 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095758 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095762 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095766 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095770 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095774 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095778 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095782 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095786 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095790 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095794 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095798 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095802 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:42.095977 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095806 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095811 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095815 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095819 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095825 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095830 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095834 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095838 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095842 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095846 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095851 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095855 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095859 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095863 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095868 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095872 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095877 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095882 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095888 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095892 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:42.096844 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095896 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095901 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095905 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095909 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095914 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095918 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095922 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095927 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095931 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095935 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095939 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095943 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095948 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095952 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095957 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095962 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095966 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095970 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095974 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:42.097551 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095979 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095982 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095987 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095991 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.095996 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096000 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096005 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096009 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096013 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096018 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096022 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096028 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096032 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096036 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096041 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096045 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096049 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096056 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096061 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096065 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:42.098113 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096069 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096073 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096077 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096082 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096765 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096775 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096780 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096786 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096790 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096795 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096799 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096803 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096807 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096812 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096817 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096821 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096825 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096830 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096835 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096839 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:42.098993 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096844 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096849 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096852 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096857 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096862 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096866 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096870 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096875 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096879 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096883 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096888 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096892 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096896 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096900 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096904 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096909 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096913 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096917 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096921 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096926 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:42.099685 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096931 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096935 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096939 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096943 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096948 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096952 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096956 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096960 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096964 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096969 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096973 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096977 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096982 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096986 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096990 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.096994 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097000 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097005 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097009 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:42.100311 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097013 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097017 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097023 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097027 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097032 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097036 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097040 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097044 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097048 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097052 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097056 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097061 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097066 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097070 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097075 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097079 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097085 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097091 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097095 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097099 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:42.101022 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097103 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097108 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097112 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097116 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097125 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097131 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097137 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097142 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097146 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097151 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.097155 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097935 2563 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097950 2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097962 2563 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097970 2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097977 2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097982 2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097989 2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.097996 2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098002 2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:29:42.101580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098007 2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098013 2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098018 2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098023 2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098028 2563 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098033 2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098038 2563 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098043 2563 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098047 2563 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098053 2563 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098059 2563 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098064 2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098069 2563 flags.go:64] FLAG: --config-dir=""
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098075 2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098081 2563 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098088 2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098093 2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098099 2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098104 2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098109 2563 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098114 2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098119 2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098124 2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098129 2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098137 2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:29:42.102100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098142 2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098147 2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098152 2563 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098157 2563 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098162 2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098170 2563 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098175 2563 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098179 2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098185 2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098190 2563 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098196 2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098201 2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098206 2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098211 2563 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098216 2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098221 2563 flags.go:64] FLAG:
--exit-on-lock-contention="false" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098226 2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098231 2563 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098236 2563 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098240 2563 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098245 2563 flags.go:64] FLAG: --feature-gates="" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098251 2563 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098256 2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098262 2563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098267 2563 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098273 2563 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:29:42.102817 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098278 2563 flags.go:64] FLAG: --help="false" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098283 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-141-239.ec2.internal" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098288 2563 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098293 2563 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:29:42.103478 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:29:42.098298 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098305 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098311 2563 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098316 2563 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098322 2563 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098326 2563 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098331 2563 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098336 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098341 2563 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098346 2563 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098351 2563 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098356 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098361 2563 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098366 2563 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098371 2563 flags.go:64] FLAG: --lock-file="" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098375 2563 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098380 2563 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098386 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098395 2563 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:29:42.103478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098400 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098405 2563 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098410 2563 flags.go:64] FLAG: --logging-format="text" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098415 2563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098421 2563 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098425 2563 flags.go:64] FLAG: --manifest-url="" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098431 2563 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098438 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098443 2563 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:29:42.104064 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:29:42.098450 2563 flags.go:64] FLAG: --max-pods="110" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098455 2563 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098460 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098465 2563 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098469 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098475 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098481 2563 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098486 2563 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098498 2563 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098503 2563 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098508 2563 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098513 2563 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098518 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098549 2563 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098554 2563 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:29:42.104064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098559 2563 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098564 2563 flags.go:64] FLAG: --port="10250" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098569 2563 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098575 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-018bf941089915ccc" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098580 2563 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098585 2563 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098589 2563 flags.go:64] FLAG: --register-node="true" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098594 2563 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098600 2563 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098606 2563 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098611 2563 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098616 2563 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098620 2563 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098626 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 
14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098632 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098641 2563 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098646 2563 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098651 2563 flags.go:64] FLAG: --runonce="false" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098656 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098661 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098666 2563 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098671 2563 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098679 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098685 2563 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098691 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098696 2563 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:29:42.104681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098702 2563 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098707 2563 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: 
I0416 14:29:42.098711 2563 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098716 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098721 2563 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098726 2563 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098731 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098740 2563 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098745 2563 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098750 2563 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098756 2563 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098761 2563 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098766 2563 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098770 2563 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098776 2563 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098780 2563 flags.go:64] FLAG: --v="2" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098787 2563 flags.go:64] FLAG: --version="false" Apr 16 14:29:42.105373 
ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098793 2563 flags.go:64] FLAG: --vmodule="" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098800 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.098806 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098963 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098973 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098977 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098982 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:29:42.105373 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098986 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098990 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098994 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.098997 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099003 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099007 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 16 
14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099011 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099016 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099020 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099025 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099029 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099034 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099038 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099042 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099046 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099050 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099055 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099059 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 
14:29:42.099063 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099067 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:29:42.106035 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099072 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099076 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099081 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099088 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099094 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099099 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099103 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099108 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099113 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099117 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099122 2563 
feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099126 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099131 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099135 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099138 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099143 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099148 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099152 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099157 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:29:42.106841 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099161 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099165 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099170 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099174 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:29:42.107345 ip-10-0-141-239 
kubenswrapper[2563]: W0416 14:29:42.099178 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099182 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099186 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099190 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099194 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099198 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099203 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099207 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099211 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099215 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099220 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099224 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099228 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 
14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099232 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099236 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099241 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:29:42.107345 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099245 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099249 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099253 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099260 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099266 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099270 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099274 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099278 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099282 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099289 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099293 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099297 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099302 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099306 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099311 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099316 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099320 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099324 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099328 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099333 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:42.107873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099337 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099341 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.099345 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.099849 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.108196 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.108213 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108265 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108270 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108273 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108277 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108280 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108283 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108287 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108293 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108296 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:42.108370 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108299 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108302 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108305 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108307 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108311 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108314 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108317 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108320 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108322 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108325 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108328 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108331 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108333 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108336 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108338 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108341 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108348 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108351 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108354 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108357 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108360 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:42.108778 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108362 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108366 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108369 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108372 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108374 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108377 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108380 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108383 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108386 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108388 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108391 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108394 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108398 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108402 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108405 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108407 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108410 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108413 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108416 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:42.109315 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108418 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108421 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108424 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108426 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108429 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108432 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108434 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108437 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108440 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108443 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108445 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108448 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108451 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108453 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108458 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108460 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108463 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108465 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108468 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:42.109915 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108470 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108473 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108476 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108478 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108481 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108484 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108486 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108489 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108492 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108495 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108497 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108500 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108503 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108506 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108508 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108511 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108514 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:42.110397 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108517 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.108523 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108657 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108663 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108667 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108670 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108673 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108676 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108678 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108682 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108687 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108691 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108694 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108698 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108701 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:42.110854 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108704 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108707 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108710 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108713 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108715 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108718 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108721 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108724 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108727 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108730 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108732 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108735 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108738 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108740 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108743 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108745 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108748 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108751 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108753 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108756 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:42.111241 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108758 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108761 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108763 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108766 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108769 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108771 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108774 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108776 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108779 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108782 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108785 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108788 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108790 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108793 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108796 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108798 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108801 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108803 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108806 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108808 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108811 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:42.111752 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108813 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108816 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108819 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108821 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108824 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108826 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108829 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108831 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108834 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108837 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108839 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108842 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108845 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108847 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108849 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108852 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108854 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108857 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108859 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108863 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:42.112277 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108865 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108868 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108870 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108873 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108875 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108878 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108882 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108885 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108887 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108890 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108892 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:42.108895 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.108900 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:29:42.112789 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.109613 2563 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:29:42.115870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.115854 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:29:42.116839 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.116827 2563 server.go:1019] "Starting client certificate rotation"
Apr 16 14:29:42.116949 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.116931 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:29:42.116985 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.116972 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:29:42.143220 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.143194 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:29:42.145831 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.145812 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:29:42.159939 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.159918 2563 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:29:42.165628 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.165608 2563 log.go:25] "Validated CRI v1 image API"
Apr 16 14:29:42.166981 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.166954 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:29:42.169587 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.169552 2563 fs.go:135] Filesystem UUIDs: map[746329bb-cb90-4533-8189-cf4d9cbf899d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f63a17c5-5de5-44ad-ae96-31dc45bfda43:/dev/nvme0n1p3]
Apr 16 14:29:42.169665 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.169586 2563 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:29:42.176903 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.176749 2563 manager.go:217] Machine: {Timestamp:2026-04-16 14:29:42.174379638 +0000 UTC m=+0.402109276 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102406 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f7e5cdf90729ef08ef7abc9f0d029 SystemUUID:ec2f7e5c-df90-729e-f08e-f7abc9f0d029 BootID:6c3e5bdf-3d79-40d1-9a96-fc19c6fa22d3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:98:b9:f6:cb:af Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:98:b9:f6:cb:af Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4e:0e:10:fe:7a:06 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:29:42.176903 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.176882 2563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:29:42.177080 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.176997 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:29:42.179410 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.179381 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:29:42.179679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.179413 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-239.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:29:42.179679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.179667 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:29:42.179828 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.179688 2563 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:29:42.179828 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.179698 2563 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:29:42.179828 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.179712 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:29:42.179828 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.179731 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:29:42.181359 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.181345 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:29:42.181493 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.181484 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 14:29:42.183608 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.183594 2563 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 14:29:42.183646 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.183612 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 14:29:42.183646 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.183626 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 14:29:42.183646 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.183636 2563 kubelet.go:397] "Adding apiserver pod source"
Apr 16 14:29:42.183743 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.183648 2563 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 14:29:42.184849 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.184833 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:29:42.184943 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.184857 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:29:42.188148 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.188131 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 14:29:42.189974 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.189957 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 14:29:42.191488 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191473 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191491 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191498 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191503 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191509 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191516 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191522 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191544 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191552 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 14:29:42.191567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191560 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 14:29:42.191922 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191578 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 14:29:42.191922 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.191588 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 14:29:42.192441 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.192430 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 14:29:42.192476 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.192443 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 14:29:42.193836 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.193803 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 14:29:42.193910 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.193857 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-239.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 14:29:42.195340 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.195325 2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-239.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 14:29:42.196384 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.196370 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 14:29:42.196447 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.196412 2563 server.go:1295] "Started kubelet"
Apr 16 14:29:42.196551 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.196511 2563 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 14:29:42.196591 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.196540 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 14:29:42.196623 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.196612 2563 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 14:29:42.197300 ip-10-0-141-239 systemd[1]: Started Kubernetes Kubelet.
Apr 16 14:29:42.197681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.197664 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 14:29:42.199118 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.199104 2563 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 14:29:42.203120 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.203090 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 14:29:42.204188 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.204168 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 14:29:42.205492 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.205471 2563 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 14:29:42.205492 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.205495 2563 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 14:29:42.205663 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.205648 2563 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 14:29:42.205715 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.205697 2563 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 14:29:42.205715 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.205705 2563 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 14:29:42.205912 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.205887 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 16 14:29:42.206581 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.206562 2563 factory.go:55] Registering systemd factory
Apr 16 14:29:42.206677 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.206635 2563 factory.go:223] Registration of the systemd container factory successfully
Apr 16 14:29:42.207046 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.207027 2563 factory.go:153] Registering CRI-O factory
Apr 16 14:29:42.207046 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.207044 2563 factory.go:223] Registration of the crio container factory successfully
Apr 16 14:29:42.207170 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.205933 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-239.ec2.internal.18a6dcb724a874a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-239.ec2.internal,UID:ip-10-0-141-239.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-239.ec2.internal,},FirstTimestamp:2026-04-16 14:29:42.196384935 +0000 UTC m=+0.424114554,LastTimestamp:2026-04-16 14:29:42.196384935 +0000 UTC m=+0.424114554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-239.ec2.internal,}"
Apr 16 14:29:42.207170 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.207034 2563 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 14:29:42.207170 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.207129 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 14:29:42.207170 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.207152 2563 factory.go:103] Registering Raw factory
Apr 16 14:29:42.207170 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.207168 2563 manager.go:1196] Started watching for new ooms in manager
Apr 16 14:29:42.208493 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.208477 2563 manager.go:319] Starting recovery of all containers
Apr 16 14:29:42.213589 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.213508 2563 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-239.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 14:29:42.213687 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.213585 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 14:29:42.220137 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.220121 2563 manager.go:324] Recovery completed
Apr 16 14:29:42.224708 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.224594 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:29:42.228104 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.228084 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:29:42.228177 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.228117 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:29:42.228177 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.228127 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:29:42.228674 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.228660 2563 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 14:29:42.228674 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.228673 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 14:29:42.228793 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.228692 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:29:42.230411 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.230344 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-239.ec2.internal.18a6dcb7268c7506 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-239.ec2.internal,UID:ip-10-0-141-239.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-239.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-239.ec2.internal,},FirstTimestamp:2026-04-16 14:29:42.228104454 +0000 UTC m=+0.455834073,LastTimestamp:2026-04-16 14:29:42.228104454 +0000 UTC m=+0.455834073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-239.ec2.internal,}"
Apr 16 14:29:42.230727 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.230674 2563 policy_none.go:49] "None policy: Start"
Apr 16 14:29:42.230727 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.230701 2563 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 14:29:42.230727 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.230718 2563 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 14:29:42.239203 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.239124 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-239.ec2.internal.18a6dcb7268cb6b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-239.ec2.internal,UID:ip-10-0-141-239.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-141-239.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-141-239.ec2.internal,},FirstTimestamp:2026-04-16 14:29:42.22812127 +0000 UTC m=+0.455850889,LastTimestamp:2026-04-16 14:29:42.22812127 +0000 UTC m=+0.455850889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-239.ec2.internal,}"
Apr 16 14:29:42.244391 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.244371 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bllw2"
Apr 16 14:29:42.250043 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.249974 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-239.ec2.internal.18a6dcb7268cde42 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-239.ec2.internal,UID:ip-10-0-141-239.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-141-239.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-141-239.ec2.internal,},FirstTimestamp:2026-04-16 14:29:42.228131394 +0000 UTC m=+0.455861013,LastTimestamp:2026-04-16 14:29:42.228131394 +0000 UTC m=+0.455861013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-239.ec2.internal,}"
Apr 16 14:29:42.256408 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.256384 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bllw2"
Apr 16 14:29:42.276199 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.276182 2563 manager.go:341] "Starting Device Plugin manager"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.276231 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.276245 2563 server.go:85] "Starting device plugin registration server"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.276569 2563 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.276589 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.276754 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.276848 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.276887 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.277315 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 14:29:42.277575 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.277353 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 16 14:29:42.333087 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.333048 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 14:29:42.334218 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.334202 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 14:29:42.334308 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.334240 2563 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 14:29:42.334308 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.334299 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 14:29:42.334403 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.334309 2563 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 14:29:42.334403 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.334354 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 14:29:42.337011 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.336989 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:29:42.377325 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.377246 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:29:42.378252 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.378235 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:29:42.378345 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.378266 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:29:42.378345 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.378277 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:29:42.378345 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.378301 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.387593 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.387574 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.387637 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.387597 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-239.ec2.internal\": node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 16 14:29:42.413126 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.413091 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 16 14:29:42.435437 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.435406 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"]
Apr 16 14:29:42.435585 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.435482 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:29:42.436475 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.436460 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:29:42.436553 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.436490 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:29:42.436553 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.436500 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:29:42.437624 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.437611 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:29:42.437752 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.437737 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.437808 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.437766 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:29:42.438364 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.438335 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:29:42.438364 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.438349 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:29:42.438364 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.438360 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:29:42.438493 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.438370 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:29:42.438493 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.438375 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:29:42.438493 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.438380 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:29:42.439862 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.439846 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.439915 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.439874 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:29:42.440543 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.440511 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:29:42.440657 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.440556 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:29:42.440657 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.440569 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:29:42.463251 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.463232 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-239.ec2.internal\" not found" node="ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.467538 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.467512 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-239.ec2.internal\" not found" node="ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.507347 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.507315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d527d52245c1df0987bf67706224db7f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"d527d52245c1df0987bf67706224db7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.507347 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.507350 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/93794f44aea94deab445c9557d98f20e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-239.ec2.internal\" (UID: \"93794f44aea94deab445c9557d98f20e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.507548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.507366 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d527d52245c1df0987bf67706224db7f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"d527d52245c1df0987bf67706224db7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.514044 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.514026 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found"
Apr 16 14:29:42.608327 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.608294 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d527d52245c1df0987bf67706224db7f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"d527d52245c1df0987bf67706224db7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.608327 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.608327 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/93794f44aea94deab445c9557d98f20e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-239.ec2.internal\" (UID: \"93794f44aea94deab445c9557d98f20e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.608558 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.608344 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d527d52245c1df0987bf67706224db7f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"d527d52245c1df0987bf67706224db7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.608558 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.608374 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d527d52245c1df0987bf67706224db7f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"d527d52245c1df0987bf67706224db7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.608558 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.608378 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d527d52245c1df0987bf67706224db7f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal\" (UID: \"d527d52245c1df0987bf67706224db7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.608558 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.608388 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/93794f44aea94deab445c9557d98f20e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-239.ec2.internal\" (UID: \"93794f44aea94deab445c9557d98f20e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:42.614379 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.614360 2563 kubelet_node_status.go:515] "Error getting the current node from
lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 16 14:29:42.715256 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.715171 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 16 14:29:42.766382 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.766351 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" Apr 16 14:29:42.769691 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:42.769594 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" Apr 16 14:29:42.816292 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.816258 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 16 14:29:42.916824 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:42.916785 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 16 14:29:43.017399 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.017297 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 16 14:29:43.116916 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.116887 2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:29:43.117445 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.117031 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:29:43.117971 ip-10-0-141-239 
kubenswrapper[2563]: E0416 14:29:43.117952 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-239.ec2.internal\" not found" Apr 16 14:29:43.172387 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.172355 2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:43.184592 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.184566 2563 apiserver.go:52] "Watching apiserver" Apr 16 14:29:43.193233 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.193201 2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:29:43.195765 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.195739 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-fvd5j","openshift-multus/multus-additional-cni-plugins-4p4tq","openshift-multus/network-metrics-daemon-9fx7w","openshift-network-diagnostics/network-check-target-9hlkg","openshift-network-operator/iptables-alerter-d9rxn","openshift-ovn-kubernetes/ovnkube-node-8wdkz","kube-system/konnectivity-agent-2cqcm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v","openshift-cluster-node-tuning-operator/tuned-98gdh","openshift-multus/multus-z5ff6","openshift-dns/node-resolver-ktxmg"] Apr 16 14:29:43.197537 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.197504 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.199249 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.199224 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.199453 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.199425 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:29:43.199569 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.199507 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047" Apr 16 14:29:43.199895 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.199876 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:29:43.199975 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.199921 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:29:43.200031 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.200000 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ddxcq\"" Apr 16 14:29:43.200344 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.200326 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:29:43.200468 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.200382 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:29:43.200468 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.200385 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e" Apr 16 14:29:43.201380 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.201361 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.201450 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.201420 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:29:43.201811 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.201794 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hs6n7\"" Apr 16 14:29:43.201893 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.201814 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:29:43.201893 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.201825 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:29:43.202010 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.201957 2563 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:29:43.202126 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.202110 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:29:43.202573 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.202559 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.203172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.203155 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:29:43.203804 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.203786 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2cqcm" Apr 16 14:29:43.204011 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.203994 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:29:43.204094 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.204000 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:29:43.204094 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.204033 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:29:43.204094 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.204080 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ktzjx\"" Apr 16 14:29:43.204976 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.204958 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:29:43.205080 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.205038 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.205133 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.205123 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" Apr 16 14:29:43.205354 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.205340 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:29:43.205404 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.205340 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:29:43.205821 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.205803 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:29:43.206137 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.206122 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.206365 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.206342 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c7xln\"" Apr 16 14:29:43.206440 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.206343 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:29:43.207001 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.206982 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:29:43.207190 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.207177 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:29:43.207247 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.207202 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:29:43.207310 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.207290 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wfqmd\"" Apr 16 14:29:43.207665 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.207650 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.208054 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.208037 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:29:43.208293 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.208274 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:29:43.208510 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.208493 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:29:43.208885 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.208868 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:29:43.209181 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.209163 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x9kfh\"" Apr 16 14:29:43.209243 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.209229 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:29:43.209504 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.209422 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vd8z2\"" Apr 16 14:29:43.210237 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.209843 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.210237 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.210151 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:29:43.210577 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.210559 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-84lv2\"" Apr 16 14:29:43.210948 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.210926 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-cni-bin\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.211041 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.210961 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-var-lib-kubelet\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.211041 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.210993 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-host-slash\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.211041 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211018 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/d2ce07ef-ba7f-4f81-a82f-80f139286fa6-agent-certs\") pod \"konnectivity-agent-2cqcm\" (UID: \"d2ce07ef-ba7f-4f81-a82f-80f139286fa6\") " pod="kube-system/konnectivity-agent-2cqcm" Apr 16 14:29:43.211164 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211041 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.211164 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211104 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.211164 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211138 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-system-cni-dir\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.211279 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211162 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-env-overrides\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.211279 ip-10-0-141-239 kubenswrapper[2563]: 
I0416 14:29:43.211186 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/898da78a-88a7-4608-baa8-e5a6bdba777f-serviceca\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.211279 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211208 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtsvr\" (UniqueName: \"kubernetes.io/projected/898da78a-88a7-4608-baa8-e5a6bdba777f-kube-api-access-gtsvr\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.211279 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211229 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-kubelet\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.211465 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211340 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-etc-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.211465 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211377 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cni-binary-copy\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") 
" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.211465 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211407 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-sys-fs\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.211465 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211441 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211468 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-node-log\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211495 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211543 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89qrt\" (UniqueName: \"kubernetes.io/projected/12388774-f1a3-4707-990d-ada56cd5b08c-kube-api-access-89qrt\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211569 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cnibin\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211593 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211619 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qflgf\" (UniqueName: \"kubernetes.io/projected/f6c99dab-e384-40fd-849c-ac070671e4ea-kube-api-access-qflgf\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211642 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.211679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211667 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovn-node-metrics-cert\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211689 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-run-netns\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211718 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-slash\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211741 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-var-lib-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211765 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovnkube-script-lib\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211789 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-kubernetes\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211815 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg6j7\" (UniqueName: \"kubernetes.io/projected/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-kube-api-access-jg6j7\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211840 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/898da78a-88a7-4608-baa8-e5a6bdba777f-host\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211862 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " 
pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211885 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlz5q\" (UniqueName: \"kubernetes.io/projected/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-kube-api-access-mlz5q\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211908 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-modprobe-d\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211930 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-socket-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211937 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211965 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-registration-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.211987 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-cni-netd\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212009 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpgkj\" (UniqueName: \"kubernetes.io/projected/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-kube-api-access-xpgkj\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.212027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212031 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-systemd-units\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212045 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-systemd\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212053 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d74dj\""
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212058 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-lib-modules\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212077 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212101 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212124 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysctl-d\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212140 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-iptables-alerter-script\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212155 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212174 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-systemd\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212217 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12388774-f1a3-4707-990d-ada56cd5b08c-tmp\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212250 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d2ce07ef-ba7f-4f81-a82f-80f139286fa6-konnectivity-ca\") pod \"konnectivity-agent-2cqcm\" (UID: \"d2ce07ef-ba7f-4f81-a82f-80f139286fa6\") " pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212278 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-device-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212305 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-ovn\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212327 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysconfig\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212350 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysctl-conf\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212384 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-run\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.212642 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212425 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12388774-f1a3-4707-990d-ada56cd5b08c-etc-tuned\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.213338 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212448 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:29:43.213338 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212452 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovnkube-config\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.213338 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212492 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-host\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.213338 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212522 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs54g\" (UniqueName: \"kubernetes.io/projected/e7706545-6db6-4426-919c-bf83b5020047-kube-api-access-cs54g\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:43.213338 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212621 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-log-socket\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.213338 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212645 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-sys\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.213338 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.212669 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-os-release\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.214099 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.214076 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:29:43.214182 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.214145 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"
Apr 16 14:29:43.214286 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.214268 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal"]
Apr 16 14:29:43.218989 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.218972 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:29:43.227372 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.227352 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:29:43.227830 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.227812 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:29:43.227931 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.227916 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal"]
Apr 16 14:29:43.237836 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.237813 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m2bzh"
Apr 16 14:29:43.246406 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.246383 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m2bzh"
Apr 16 14:29:43.258055 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.258023 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:24:42 +0000 UTC" deadline="2027-10-22 14:19:43.798437952 +0000 UTC"
Apr 16 14:29:43.258055 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.258050 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13295h50m0.540391208s"
Apr 16 14:29:43.306502 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.306466 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 14:29:43.313794 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313622 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-systemd\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.313794 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313674 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12388774-f1a3-4707-990d-ada56cd5b08c-tmp\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.313794 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313712 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d2ce07ef-ba7f-4f81-a82f-80f139286fa6-konnectivity-ca\") pod \"konnectivity-agent-2cqcm\" (UID: \"d2ce07ef-ba7f-4f81-a82f-80f139286fa6\") " pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:29:43.313794 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313722 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-systemd\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.313794 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313748 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-device-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313841 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-etc-kubernetes\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313912 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-device-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313899 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-ovn\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313964 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-ovn\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.313976 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysconfig\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314017 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysctl-conf\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314041 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-run\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314042 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysconfig\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314059 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12388774-f1a3-4707-990d-ada56cd5b08c-etc-tuned\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314117 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314101 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-run\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314129 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovnkube-config\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314153 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-host\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314186 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysctl-conf\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314201 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-host\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314211 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs54g\" (UniqueName: \"kubernetes.io/projected/e7706545-6db6-4426-919c-bf83b5020047-kube-api-access-cs54g\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314180 2563 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314339 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-log-socket\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314376 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-sys\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314424 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-os-release\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314439 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d2ce07ef-ba7f-4f81-a82f-80f139286fa6-konnectivity-ca\") pod \"konnectivity-agent-2cqcm\" (UID: \"d2ce07ef-ba7f-4f81-a82f-80f139286fa6\") " pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314467 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-kubelet\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314522 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-cni-bin\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.314578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314539 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-sys\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314592 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-var-lib-kubelet\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314605 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-log-socket\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314619 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-host-slash\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314685 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d2ce07ef-ba7f-4f81-a82f-80f139286fa6-agent-certs\") pod \"konnectivity-agent-2cqcm\" (UID: \"d2ce07ef-ba7f-4f81-a82f-80f139286fa6\") " pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314686 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-os-release\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314750 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-host-slash\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314782 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-cni-bin\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314827 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314851 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-var-lib-kubelet\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314874 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-hostroot\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314877 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovnkube-config\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314906 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314910 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314942 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-system-cni-dir\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314969 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-run-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.315216 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.314985 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7c31e47-1a62-42f4-b8c1-63188895e755-cni-binary-copy\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315021 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-env-overrides\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315582 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/898da78a-88a7-4608-baa8-e5a6bdba777f-serviceca\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315616 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtsvr\" (UniqueName: \"kubernetes.io/projected/898da78a-88a7-4608-baa8-e5a6bdba777f-kube-api-access-gtsvr\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315641 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-kubelet\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315666 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-etc-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315694 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cni-binary-copy\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315719 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-sys-fs\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315749 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-multus-certs\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315785 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315816 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-socket-dir-parent\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315842 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-netns\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315870 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-node-log\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315903 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315930 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89qrt\" (UniqueName: \"kubernetes.io/projected/12388774-f1a3-4707-990d-ada56cd5b08c-kube-api-access-89qrt\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315956 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cnibin\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.315977 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.316048 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316004 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qflgf\" (UniqueName: \"kubernetes.io/projected/f6c99dab-e384-40fd-849c-ac070671e4ea-kube-api-access-qflgf\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v"
Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316032 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-cni-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6"
Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316064 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq"
Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316093 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovn-node-metrics-cert\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316122 2563 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-run-netns\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316134 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-env-overrides\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316151 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-system-cni-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316179 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-k8s-cni-cncf-io\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316207 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-cni-bin\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316234 2563 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-daemon-config\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316262 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-slash\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316290 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-var-lib-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316324 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovnkube-script-lib\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316353 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-kubernetes\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.316861 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:29:43.316375 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-system-cni-dir\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316385 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg6j7\" (UniqueName: \"kubernetes.io/projected/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-kube-api-access-jg6j7\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316434 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cnibin\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.316861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316471 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5472\" (UniqueName: \"kubernetes.io/projected/c7c31e47-1a62-42f4-b8c1-63188895e755-kube-api-access-n5472\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316519 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c240568-6c79-4c9e-af56-ea680b0f0410-hosts-file\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " 
pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316578 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/898da78a-88a7-4608-baa8-e5a6bdba777f-host\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316610 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316636 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlz5q\" (UniqueName: \"kubernetes.io/projected/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-kube-api-access-mlz5q\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316837 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-modprobe-d\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316871 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-socket-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316901 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-registration-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316931 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-cnibin\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316959 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-cni-netd\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.316979 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/898da78a-88a7-4608-baa8-e5a6bdba777f-serviceca\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317003 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpgkj\" (UniqueName: \"kubernetes.io/projected/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-kube-api-access-xpgkj\") pod \"multus-additional-cni-plugins-4p4tq\" 
(UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317020 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-sys-fs\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317063 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-node-log\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317097 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317199 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-var-lib-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317208 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317267 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/898da78a-88a7-4608-baa8-e5a6bdba777f-host\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.317655 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.317412 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.317507 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:43.817472262 +0000 UTC m=+2.045201883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317718 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317726 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-slash\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317743 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-os-release\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317784 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-kubernetes\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317821 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-kubelet\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317872 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-etc-openvswitch\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.317961 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-modprobe-d\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318001 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-run-netns\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318050 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovnkube-script-lib\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318149 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-cni-netd\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318180 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-cni-binary-copy\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318215 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-registration-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318295 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8c240568-6c79-4c9e-af56-ea680b0f0410-tmp-dir\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.318443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318330 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-socket-dir\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.318443 
ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318381 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvwxt\" (UniqueName: \"kubernetes.io/projected/8c240568-6c79-4c9e-af56-ea680b0f0410-kube-api-access-vvwxt\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318432 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-systemd-units\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318464 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-systemd\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318487 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-lib-modules\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318574 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-systemd\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 
14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318568 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-systemd-units\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318607 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318623 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318655 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-cni-multus\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318698 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-lib-modules\") pod \"tuned-98gdh\" (UID: 
\"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318805 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-conf-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318849 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318895 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318900 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysctl-d\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.318940 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-iptables-alerter-script\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.319014 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.319035 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12388774-f1a3-4707-990d-ada56cd5b08c-etc-sysctl-d\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.319227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.319185 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6c99dab-e384-40fd-849c-ac070671e4ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.320058 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.319641 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-iptables-alerter-script\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.320147 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.320102 2563 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12388774-f1a3-4707-990d-ada56cd5b08c-etc-tuned\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.320201 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.320177 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12388774-f1a3-4707-990d-ada56cd5b08c-tmp\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.320828 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.320802 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-ovn-node-metrics-cert\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.321404 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.321386 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d2ce07ef-ba7f-4f81-a82f-80f139286fa6-agent-certs\") pod \"konnectivity-agent-2cqcm\" (UID: \"d2ce07ef-ba7f-4f81-a82f-80f139286fa6\") " pod="kube-system/konnectivity-agent-2cqcm" Apr 16 14:29:43.330165 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.330092 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs54g\" (UniqueName: \"kubernetes.io/projected/e7706545-6db6-4426-919c-bf83b5020047-kube-api-access-cs54g\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:29:43.330165 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.330119 2563 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-jg6j7\" (UniqueName: \"kubernetes.io/projected/58345d0f-99ee-4b7c-85e0-26a9d8dbad5a-kube-api-access-jg6j7\") pod \"iptables-alerter-d9rxn\" (UID: \"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a\") " pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.330370 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.330351 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qflgf\" (UniqueName: \"kubernetes.io/projected/f6c99dab-e384-40fd-849c-ac070671e4ea-kube-api-access-qflgf\") pod \"aws-ebs-csi-driver-node-c2z6v\" (UID: \"f6c99dab-e384-40fd-849c-ac070671e4ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.331711 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.331689 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtsvr\" (UniqueName: \"kubernetes.io/projected/898da78a-88a7-4608-baa8-e5a6bdba777f-kube-api-access-gtsvr\") pod \"node-ca-fvd5j\" (UID: \"898da78a-88a7-4608-baa8-e5a6bdba777f\") " pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.331935 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.331920 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpgkj\" (UniqueName: \"kubernetes.io/projected/e891483d-413b-4b2d-a2d0-ee6b42f8ccbf-kube-api-access-xpgkj\") pod \"multus-additional-cni-plugins-4p4tq\" (UID: \"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf\") " pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.331979 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.331962 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89qrt\" (UniqueName: \"kubernetes.io/projected/12388774-f1a3-4707-990d-ada56cd5b08c-kube-api-access-89qrt\") pod \"tuned-98gdh\" (UID: \"12388774-f1a3-4707-990d-ada56cd5b08c\") " pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.337718 ip-10-0-141-239 
kubenswrapper[2563]: E0416 14:29:43.337694 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:43.337806 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.337722 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:43.337806 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.337736 2563 projected.go:194] Error preparing data for projected volume kube-api-access-bcghq for pod openshift-network-diagnostics/network-check-target-9hlkg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:43.337877 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.337819 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq podName:2edb25cc-c726-4b56-8a1b-f3877bff370e nodeName:}" failed. No retries permitted until 2026-04-16 14:29:43.837800846 +0000 UTC m=+2.065530472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bcghq" (UniqueName: "kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq") pod "network-check-target-9hlkg" (UID: "2edb25cc-c726-4b56-8a1b-f3877bff370e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:43.338464 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.338444 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlz5q\" (UniqueName: \"kubernetes.io/projected/29c3ac6b-dd94-4f4b-88ca-cf83af0046d3-kube-api-access-mlz5q\") pod \"ovnkube-node-8wdkz\" (UID: \"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.402789 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.402742 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93794f44aea94deab445c9557d98f20e.slice/crio-0f38b2ad0ba73f1eb1f38663608d3fd81cc77cd3fc81304d080a63dae6c50e64 WatchSource:0}: Error finding container 0f38b2ad0ba73f1eb1f38663608d3fd81cc77cd3fc81304d080a63dae6c50e64: Status 404 returned error can't find the container with id 0f38b2ad0ba73f1eb1f38663608d3fd81cc77cd3fc81304d080a63dae6c50e64 Apr 16 14:29:43.403014 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.402995 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd527d52245c1df0987bf67706224db7f.slice/crio-2d51c912079c2627b639baf83c65c57a94f8d2298898a2f3bfcf573d9b21de38 WatchSource:0}: Error finding container 2d51c912079c2627b639baf83c65c57a94f8d2298898a2f3bfcf573d9b21de38: Status 404 returned error can't find the container with id 2d51c912079c2627b639baf83c65c57a94f8d2298898a2f3bfcf573d9b21de38 Apr 16 14:29:43.406220 ip-10-0-141-239 kubenswrapper[2563]: I0416 
14:29:43.406205 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:29:43.420154 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420130 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-cni-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420154 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420159 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-system-cni-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420176 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-k8s-cni-cncf-io\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420191 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-cni-bin\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420207 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-daemon-config\") pod \"multus-z5ff6\" (UID: 
\"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420240 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5472\" (UniqueName: \"kubernetes.io/projected/c7c31e47-1a62-42f4-b8c1-63188895e755-kube-api-access-n5472\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420263 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-k8s-cni-cncf-io\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420265 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-system-cni-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420264 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c240568-6c79-4c9e-af56-ea680b0f0410-hosts-file\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.420326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420319 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8c240568-6c79-4c9e-af56-ea680b0f0410-hosts-file\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " 
pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420330 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-cni-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420377 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-cnibin\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420392 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-cni-bin\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420452 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-cnibin\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420411 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-os-release\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420493 2563 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8c240568-6c79-4c9e-af56-ea680b0f0410-tmp-dir\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420518 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-os-release\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420522 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvwxt\" (UniqueName: \"kubernetes.io/projected/8c240568-6c79-4c9e-af56-ea680b0f0410-kube-api-access-vvwxt\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420604 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-cni-multus\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420626 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-conf-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420660 2563 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-cni-multus\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420664 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-etc-kubernetes\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.420718 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420696 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-etc-kubernetes\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420728 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-kubelet\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420663 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-conf-dir\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420760 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-hostroot\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420804 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-hostroot\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420805 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8c240568-6c79-4c9e-af56-ea680b0f0410-tmp-dir\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420819 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7c31e47-1a62-42f4-b8c1-63188895e755-cni-binary-copy\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420762 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-var-lib-kubelet\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420858 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-multus-certs\") pod \"multus-z5ff6\" 
(UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420880 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-socket-dir-parent\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420886 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-multus-certs\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420900 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-netns\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420931 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-socket-dir-parent\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420959 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7c31e47-1a62-42f4-b8c1-63188895e755-multus-daemon-config\") pod \"multus-z5ff6\" (UID: 
\"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.420980 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7c31e47-1a62-42f4-b8c1-63188895e755-host-run-netns\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.421257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.421214 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7c31e47-1a62-42f4-b8c1-63188895e755-cni-binary-copy\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.432666 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.432636 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5472\" (UniqueName: \"kubernetes.io/projected/c7c31e47-1a62-42f4-b8c1-63188895e755-kube-api-access-n5472\") pod \"multus-z5ff6\" (UID: \"c7c31e47-1a62-42f4-b8c1-63188895e755\") " pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.432666 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.432652 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvwxt\" (UniqueName: \"kubernetes.io/projected/8c240568-6c79-4c9e-af56-ea680b0f0410-kube-api-access-vvwxt\") pod \"node-resolver-ktxmg\" (UID: \"8c240568-6c79-4c9e-af56-ea680b0f0410\") " pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.518715 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.518685 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fvd5j" Apr 16 14:29:43.525309 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.525278 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod898da78a_88a7_4608_baa8_e5a6bdba777f.slice/crio-011ca368cb92d57074c2c0d2a97bb713d853f44adf9fda26fdd9be898091b94d WatchSource:0}: Error finding container 011ca368cb92d57074c2c0d2a97bb713d853f44adf9fda26fdd9be898091b94d: Status 404 returned error can't find the container with id 011ca368cb92d57074c2c0d2a97bb713d853f44adf9fda26fdd9be898091b94d Apr 16 14:29:43.542075 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.542044 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" Apr 16 14:29:43.548757 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.548732 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode891483d_413b_4b2d_a2d0_ee6b42f8ccbf.slice/crio-6f69888b7da0c71717782348476739be25a6c784d345dd5511dd2a3f58eb7590 WatchSource:0}: Error finding container 6f69888b7da0c71717782348476739be25a6c784d345dd5511dd2a3f58eb7590: Status 404 returned error can't find the container with id 6f69888b7da0c71717782348476739be25a6c784d345dd5511dd2a3f58eb7590 Apr 16 14:29:43.549652 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.549634 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-d9rxn" Apr 16 14:29:43.555721 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.555699 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58345d0f_99ee_4b7c_85e0_26a9d8dbad5a.slice/crio-8f031aefb292f63bedb77f03f964bfe53f1dfd49d91a8b6faed4f0be2906848d WatchSource:0}: Error finding container 8f031aefb292f63bedb77f03f964bfe53f1dfd49d91a8b6faed4f0be2906848d: Status 404 returned error can't find the container with id 8f031aefb292f63bedb77f03f964bfe53f1dfd49d91a8b6faed4f0be2906848d Apr 16 14:29:43.577174 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.577152 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" Apr 16 14:29:43.584396 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.584371 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c3ac6b_dd94_4f4b_88ca_cf83af0046d3.slice/crio-c62587359177eaf06df0cc9327e6f92ff3908662730ca52b107f616cbbbad09f WatchSource:0}: Error finding container c62587359177eaf06df0cc9327e6f92ff3908662730ca52b107f616cbbbad09f: Status 404 returned error can't find the container with id c62587359177eaf06df0cc9327e6f92ff3908662730ca52b107f616cbbbad09f Apr 16 14:29:43.595468 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.595445 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2cqcm" Apr 16 14:29:43.602032 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.602009 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ce07ef_ba7f_4f81_a82f_80f139286fa6.slice/crio-c8c15265c31c768544c8e148a96edf4d6cf3e3437a9f3e21952f3e6ebe29a02f WatchSource:0}: Error finding container c8c15265c31c768544c8e148a96edf4d6cf3e3437a9f3e21952f3e6ebe29a02f: Status 404 returned error can't find the container with id c8c15265c31c768544c8e148a96edf4d6cf3e3437a9f3e21952f3e6ebe29a02f Apr 16 14:29:43.607352 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.607286 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" Apr 16 14:29:43.615079 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.615055 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c99dab_e384_40fd_849c_ac070671e4ea.slice/crio-a96b8d732d9292a433839602d39a29925d58d0391616e0b50682d65352dc1cad WatchSource:0}: Error finding container a96b8d732d9292a433839602d39a29925d58d0391616e0b50682d65352dc1cad: Status 404 returned error can't find the container with id a96b8d732d9292a433839602d39a29925d58d0391616e0b50682d65352dc1cad Apr 16 14:29:43.617246 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.617175 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-98gdh" Apr 16 14:29:43.623080 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.623057 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12388774_f1a3_4707_990d_ada56cd5b08c.slice/crio-a0ac213d6a4d36cfc468b038660b51c346e571dee13e586b0826e70708a49bf3 WatchSource:0}: Error finding container a0ac213d6a4d36cfc468b038660b51c346e571dee13e586b0826e70708a49bf3: Status 404 returned error can't find the container with id a0ac213d6a4d36cfc468b038660b51c346e571dee13e586b0826e70708a49bf3 Apr 16 14:29:43.646722 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.646691 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z5ff6" Apr 16 14:29:43.652351 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.652334 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ktxmg" Apr 16 14:29:43.653746 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.653722 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c31e47_1a62_42f4_b8c1_63188895e755.slice/crio-228ac5e3fc40dc29322cf6d27a09180db52ed03a66b32d7d79f51351fcf1f20a WatchSource:0}: Error finding container 228ac5e3fc40dc29322cf6d27a09180db52ed03a66b32d7d79f51351fcf1f20a: Status 404 returned error can't find the container with id 228ac5e3fc40dc29322cf6d27a09180db52ed03a66b32d7d79f51351fcf1f20a Apr 16 14:29:43.661357 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:29:43.661331 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c240568_6c79_4c9e_af56_ea680b0f0410.slice/crio-ff262c5aae0a3390f9be1e12a7d1f8908e2c84d24acd6117284e8e522e07e299 WatchSource:0}: Error finding container 
ff262c5aae0a3390f9be1e12a7d1f8908e2c84d24acd6117284e8e522e07e299: Status 404 returned error can't find the container with id ff262c5aae0a3390f9be1e12a7d1f8908e2c84d24acd6117284e8e522e07e299 Apr 16 14:29:43.761006 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.760970 2563 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:43.823825 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.823733 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:29:43.823991 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.823906 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:43.823991 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.823972 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:44.823954279 +0000 UTC m=+3.051683941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:43.925068 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:43.925020 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:43.925357 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.925338 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:29:43.925440 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.925402 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:29:43.925496 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.925444 2563 projected.go:194] Error preparing data for projected volume kube-api-access-bcghq for pod openshift-network-diagnostics/network-check-target-9hlkg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:43.925566 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:43.925512 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq podName:2edb25cc-c726-4b56-8a1b-f3877bff370e nodeName:}" failed. No retries permitted until 2026-04-16 14:29:44.92548883 +0000 UTC m=+3.153218443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-bcghq" (UniqueName: "kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq") pod "network-check-target-9hlkg" (UID: "2edb25cc-c726-4b56-8a1b-f3877bff370e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:44.219145 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.219034 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:29:44.248033 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.247993 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:24:43 +0000 UTC" deadline="2027-12-23 09:24:54.962633612 +0000 UTC"
Apr 16 14:29:44.248033 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.248030 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14778h55m10.71460744s"
Apr 16 14:29:44.337713 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.337682 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:44.337886 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.337865 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:29:44.338444 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.338411 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:44.338554 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.338505 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:29:44.355434 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.355378 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ktxmg" event={"ID":"8c240568-6c79-4c9e-af56-ea680b0f0410","Type":"ContainerStarted","Data":"ff262c5aae0a3390f9be1e12a7d1f8908e2c84d24acd6117284e8e522e07e299"}
Apr 16 14:29:44.361330 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.361296 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-98gdh" event={"ID":"12388774-f1a3-4707-990d-ada56cd5b08c","Type":"ContainerStarted","Data":"a0ac213d6a4d36cfc468b038660b51c346e571dee13e586b0826e70708a49bf3"}
Apr 16 14:29:44.377811 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.377766 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" event={"ID":"d527d52245c1df0987bf67706224db7f","Type":"ContainerStarted","Data":"2d51c912079c2627b639baf83c65c57a94f8d2298898a2f3bfcf573d9b21de38"}
Apr 16 14:29:44.390105 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.390066 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" event={"ID":"93794f44aea94deab445c9557d98f20e","Type":"ContainerStarted","Data":"0f38b2ad0ba73f1eb1f38663608d3fd81cc77cd3fc81304d080a63dae6c50e64"}
Apr 16 14:29:44.403752 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.403712 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5ff6" event={"ID":"c7c31e47-1a62-42f4-b8c1-63188895e755","Type":"ContainerStarted","Data":"228ac5e3fc40dc29322cf6d27a09180db52ed03a66b32d7d79f51351fcf1f20a"}
Apr 16 14:29:44.419189 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.419153 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" event={"ID":"f6c99dab-e384-40fd-849c-ac070671e4ea","Type":"ContainerStarted","Data":"a96b8d732d9292a433839602d39a29925d58d0391616e0b50682d65352dc1cad"}
Apr 16 14:29:44.428250 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.428208 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2cqcm" event={"ID":"d2ce07ef-ba7f-4f81-a82f-80f139286fa6","Type":"ContainerStarted","Data":"c8c15265c31c768544c8e148a96edf4d6cf3e3437a9f3e21952f3e6ebe29a02f"}
Apr 16 14:29:44.444352 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.444240 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"c62587359177eaf06df0cc9327e6f92ff3908662730ca52b107f616cbbbad09f"}
Apr 16 14:29:44.459829 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.459736 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d9rxn" event={"ID":"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a","Type":"ContainerStarted","Data":"8f031aefb292f63bedb77f03f964bfe53f1dfd49d91a8b6faed4f0be2906848d"}
Apr 16 14:29:44.468066 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.468021 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerStarted","Data":"6f69888b7da0c71717782348476739be25a6c784d345dd5511dd2a3f58eb7590"}
Apr 16 14:29:44.492785 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.492693 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fvd5j" event={"ID":"898da78a-88a7-4608-baa8-e5a6bdba777f","Type":"ContainerStarted","Data":"011ca368cb92d57074c2c0d2a97bb713d853f44adf9fda26fdd9be898091b94d"}
Apr 16 14:29:44.836393 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.836298 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:44.836670 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.836474 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:44.836670 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.836563 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:46.836542147 +0000 UTC m=+5.064271769 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:44.937150 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.937008 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:44.937340 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.937178 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:29:44.937340 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.937197 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:29:44.937340 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.937212 2563 projected.go:194] Error preparing data for projected volume kube-api-access-bcghq for pod openshift-network-diagnostics/network-check-target-9hlkg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:44.937340 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.937271 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq podName:2edb25cc-c726-4b56-8a1b-f3877bff370e nodeName:}" failed.
No retries permitted until 2026-04-16 14:29:46.937251979 +0000 UTC m=+5.164981589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bcghq" (UniqueName: "kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq") pod "network-check-target-9hlkg" (UID: "2edb25cc-c726-4b56-8a1b-f3877bff370e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:44.944226 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.944193 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x5q8k"]
Apr 16 14:29:44.951754 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:44.951724 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:44.951907 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:44.951815 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:29:45.037988 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.037891 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-kubelet-config\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.037988 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.037938 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.038258 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.038037 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-dbus\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.139189 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.138934 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-kubelet-config\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.139189 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.138983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.139189 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.139029 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-dbus\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.139452 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.139235 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-dbus\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.139452 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.139311 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-kubelet-config\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.139452 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:45.139402 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:45.140057 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:45.139463 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret podName:fd31aa30-3e27-4c57-ae7d-843fa27b25d3 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:45.639443657 +0000 UTC m=+3.867173268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret") pod "global-pull-secret-syncer-x5q8k" (UID: "fd31aa30-3e27-4c57-ae7d-843fa27b25d3") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:45.248712 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.248668 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:24:43 +0000 UTC" deadline="2027-11-01 17:51:02.665887548 +0000 UTC"
Apr 16 14:29:45.248712 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.248710 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13539h21m17.417181744s"
Apr 16 14:29:45.645620 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:45.645042 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:45.645620 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:45.645192 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:45.645620 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:45.645271 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret podName:fd31aa30-3e27-4c57-ae7d-843fa27b25d3 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:46.645251867 +0000 UTC m=+4.872981478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret") pod "global-pull-secret-syncer-x5q8k" (UID: "fd31aa30-3e27-4c57-ae7d-843fa27b25d3") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:46.336861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:46.334600 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:46.336861 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.334728 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:29:46.336861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:46.335184 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:46.336861 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.335283 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:29:46.336861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:46.335432 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:46.336861 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.335511 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:29:46.654781 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:46.654692 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:46.654962 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.654863 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:46.655023 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.655013 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret podName:fd31aa30-3e27-4c57-ae7d-843fa27b25d3 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:48.654934014 +0000 UTC m=+6.882663644 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret") pod "global-pull-secret-syncer-x5q8k" (UID: "fd31aa30-3e27-4c57-ae7d-843fa27b25d3") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:46.857482 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:46.856866 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:46.857482 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.857072 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:46.857482 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.857144 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:50.85712271 +0000 UTC m=+9.084852318 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:46.958183 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:46.958097 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:46.958340 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.958259 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:29:46.958340 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.958277 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:29:46.958340 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.958289 2563 projected.go:194] Error preparing data for projected volume kube-api-access-bcghq for pod openshift-network-diagnostics/network-check-target-9hlkg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:46.958458 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:46.958344 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq podName:2edb25cc-c726-4b56-8a1b-f3877bff370e nodeName:}" failed. No retries permitted until 2026-04-16 14:29:50.958326965 +0000 UTC m=+9.186056593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-bcghq" (UniqueName: "kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq") pod "network-check-target-9hlkg" (UID: "2edb25cc-c726-4b56-8a1b-f3877bff370e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:48.335291 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:48.335252 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:48.335798 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:48.335257 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:48.335798 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:48.335391 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:29:48.335798 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:48.335254 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:48.335798 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:48.335521 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:29:48.335798 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:48.335642 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:29:48.673088 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:48.672951 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:48.673259 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:48.673124 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:48.673259 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:48.673207 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret podName:fd31aa30-3e27-4c57-ae7d-843fa27b25d3 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:52.6731853 +0000 UTC m=+10.900914908 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret") pod "global-pull-secret-syncer-x5q8k" (UID: "fd31aa30-3e27-4c57-ae7d-843fa27b25d3") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:50.335637 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:50.335153 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:50.335637 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:50.335173 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:50.335637 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:50.335179 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:50.335637 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.335299 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:29:50.337397 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.337318 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:29:50.341401 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.337167 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:29:50.894020 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:50.893984 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:50.894201 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.894187 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:50.894263 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.894236 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:58.894222235 +0000 UTC m=+17.121951841 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:29:50.994820 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:50.994784 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:50.994992 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.994939 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:29:50.994992 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.994958 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:29:50.994992 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.994971 2563 projected.go:194] Error preparing data for projected volume kube-api-access-bcghq for pod openshift-network-diagnostics/network-check-target-9hlkg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:50.995155 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:50.995032 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq podName:2edb25cc-c726-4b56-8a1b-f3877bff370e nodeName:}" failed. No retries permitted until 2026-04-16 14:29:58.995014256 +0000 UTC m=+17.222743887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-bcghq" (UniqueName: "kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq") pod "network-check-target-9hlkg" (UID: "2edb25cc-c726-4b56-8a1b-f3877bff370e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:29:52.335602 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:52.335568 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:52.335602 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:52.335590 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:52.335602 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:52.335563 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:52.336129 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:52.335684 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:29:52.336129 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:52.335767 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:29:52.336129 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:52.335864 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:29:52.707844 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:52.707670 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:52.707844 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:52.707833 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:52.708041 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:52.707916 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret podName:fd31aa30-3e27-4c57-ae7d-843fa27b25d3 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:00.707893375 +0000 UTC m=+18.935622991 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret") pod "global-pull-secret-syncer-x5q8k" (UID: "fd31aa30-3e27-4c57-ae7d-843fa27b25d3") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:29:54.337828 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:54.337796 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:29:54.338255 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:54.337797 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:29:54.338255 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:54.337929 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:29:54.338255 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:54.337797 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:29:54.338255 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:54.337996 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e" Apr 16 14:29:54.338255 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:54.338089 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3" Apr 16 14:29:56.334980 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:56.334704 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k" Apr 16 14:29:56.335419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:56.334785 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:29:56.335419 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:56.335084 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3" Apr 16 14:29:56.335419 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:56.335171 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047" Apr 16 14:29:56.335419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:56.334799 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:29:56.335419 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:56.335298 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e" Apr 16 14:29:58.335196 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:58.335154 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:29:58.335196 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:58.335192 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k" Apr 16 14:29:58.335744 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:58.335287 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e" Apr 16 14:29:58.335744 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:58.335404 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3" Apr 16 14:29:58.335744 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:58.335465 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:29:58.335744 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:58.335564 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047" Apr 16 14:29:58.952653 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:58.952611 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:29:58.952926 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:58.952828 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:58.952926 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:58.952903 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:14.952884723 +0000 UTC m=+33.180614332 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:59.053606 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:29:59.053570 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:29:59.053818 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:59.053771 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:59.053818 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:59.053801 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:59.053818 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:59.053815 2563 projected.go:194] Error preparing data for projected volume kube-api-access-bcghq for pod openshift-network-diagnostics/network-check-target-9hlkg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:59.053957 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:29:59.053869 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq podName:2edb25cc-c726-4b56-8a1b-f3877bff370e nodeName:}" failed. 
No retries permitted until 2026-04-16 14:30:15.053854704 +0000 UTC m=+33.281584318 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-bcghq" (UniqueName: "kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq") pod "network-check-target-9hlkg" (UID: "2edb25cc-c726-4b56-8a1b-f3877bff370e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:30:00.335405 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:00.335366 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k" Apr 16 14:30:00.335870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:00.335366 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:30:00.335870 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:00.335494 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3" Apr 16 14:30:00.335870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:00.335366 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:30:00.335870 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:00.335618 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047" Apr 16 14:30:00.335870 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:00.335715 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e" Apr 16 14:30:00.769013 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:00.768880 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k" Apr 16 14:30:00.769168 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:00.769045 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:30:00.769168 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:00.769124 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret podName:fd31aa30-3e27-4c57-ae7d-843fa27b25d3 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:16.769104164 +0000 UTC m=+34.996833774 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret") pod "global-pull-secret-syncer-x5q8k" (UID: "fd31aa30-3e27-4c57-ae7d-843fa27b25d3") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:30:02.335963 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.335663 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:30:02.336325 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.335769 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:30:02.336325 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:02.335992 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e" Apr 16 14:30:02.336325 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:02.336133 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047" Apr 16 14:30:02.336325 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.336174 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k" Apr 16 14:30:02.336325 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:02.336252 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3" Apr 16 14:30:02.544591 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.544564 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:30:02.545386 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.545352 2563 generic.go:358] "Generic (PLEG): container finished" podID="29c3ac6b-dd94-4f4b-88ca-cf83af0046d3" containerID="8876253e6a3b4ff27cd05e6cdd29954cb5f7662a52edf53956eeb836545ed7b5" exitCode=1 Apr 16 14:30:02.545512 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.545435 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"13f7f3df9aa3109501e693113ac3a58094540c19ffa3bf7f4229acfe52911800"} Apr 16 14:30:02.545512 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.545490 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"8e5bbaa088cdfcf4cbdaba6dc429d3373d6c38598399a2aa68e34b8b493159d8"} Apr 16 14:30:02.545512 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.545500 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" 
event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"1922b0a970fd9a801ad3494900dd6f9adf45fea5e6299df1621049463718a9b2"} Apr 16 14:30:02.545512 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.545508 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerDied","Data":"8876253e6a3b4ff27cd05e6cdd29954cb5f7662a52edf53956eeb836545ed7b5"} Apr 16 14:30:02.545767 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.545519 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"52acbf30e4b1258e825c97532812c05a6221e9f9aeeda3df63b6266c93750249"} Apr 16 14:30:02.547323 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.547294 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-98gdh" event={"ID":"12388774-f1a3-4707-990d-ada56cd5b08c","Type":"ContainerStarted","Data":"b30707b56ead564cc2d8980142dd6d33b0e39324a04079f228e0dc55c92b1615"} Apr 16 14:30:02.552843 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.552737 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" event={"ID":"93794f44aea94deab445c9557d98f20e","Type":"ContainerStarted","Data":"ae8869a245b461e35ff3bf5dd65b41212e20c3702503e950a7bbcf1ad4aa74c7"} Apr 16 14:30:02.556857 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.556830 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z5ff6" event={"ID":"c7c31e47-1a62-42f4-b8c1-63188895e755","Type":"ContainerStarted","Data":"9f254d4da7f26950c1f6349287e290fcaba285929f6d68c10f7b2c8534dc67cd"} Apr 16 14:30:02.564857 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.564805 2563 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-node-tuning-operator/tuned-98gdh" podStartSLOduration=2.312003457 podStartE2EDuration="20.564789558s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.624677484 +0000 UTC m=+1.852407090" lastFinishedPulling="2026-04-16 14:30:01.877463568 +0000 UTC m=+20.105193191" observedRunningTime="2026-04-16 14:30:02.563678177 +0000 UTC m=+20.791407805" watchObservedRunningTime="2026-04-16 14:30:02.564789558 +0000 UTC m=+20.792519192" Apr 16 14:30:02.583420 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.583363 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z5ff6" podStartSLOduration=2.36314892 podStartE2EDuration="20.583348148s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.657023347 +0000 UTC m=+1.884752962" lastFinishedPulling="2026-04-16 14:30:01.877222579 +0000 UTC m=+20.104952190" observedRunningTime="2026-04-16 14:30:02.582851059 +0000 UTC m=+20.810580688" watchObservedRunningTime="2026-04-16 14:30:02.583348148 +0000 UTC m=+20.811077775" Apr 16 14:30:02.599015 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:02.598962 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-239.ec2.internal" podStartSLOduration=19.598946704 podStartE2EDuration="19.598946704s" podCreationTimestamp="2026-04-16 14:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:02.598491397 +0000 UTC m=+20.826221027" watchObservedRunningTime="2026-04-16 14:30:02.598946704 +0000 UTC m=+20.826676332" Apr 16 14:30:03.561066 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.560813 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" 
event={"ID":"f6c99dab-e384-40fd-849c-ac070671e4ea","Type":"ContainerStarted","Data":"24d3355d07ac8b6a58722b9b5bb2ce2515342bf19597035001e23dd2f6fd1bff"} Apr 16 14:30:03.562326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.562292 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2cqcm" event={"ID":"d2ce07ef-ba7f-4f81-a82f-80f139286fa6","Type":"ContainerStarted","Data":"033a54462c4a1bd6f1ae81f23abc0a075ff28e33a26c218feff97755cd815029"} Apr 16 14:30:03.565110 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.565084 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:30:03.565514 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.565485 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"acd480c8140204d858eb9136989faeadae4dce975fa4330563ad17f3b571ff3a"} Apr 16 14:30:03.566931 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.566903 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d9rxn" event={"ID":"58345d0f-99ee-4b7c-85e0-26a9d8dbad5a","Type":"ContainerStarted","Data":"d48618fa87c6ff890e1c66e548ebbae64a38b4b41a8ef0fa54f34b589ccf7112"} Apr 16 14:30:03.568280 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.568258 2563 generic.go:358] "Generic (PLEG): container finished" podID="e891483d-413b-4b2d-a2d0-ee6b42f8ccbf" containerID="21f2e55b3b0cc6e1637c77baa87f893d921bcd2a64d872c63b2e3202c1973d0f" exitCode=0 Apr 16 14:30:03.568378 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.568333 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" 
event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerDied","Data":"21f2e55b3b0cc6e1637c77baa87f893d921bcd2a64d872c63b2e3202c1973d0f"} Apr 16 14:30:03.569668 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.569642 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fvd5j" event={"ID":"898da78a-88a7-4608-baa8-e5a6bdba777f","Type":"ContainerStarted","Data":"1890e14cfc57d2d4f5f5fa19097c311309b42a758d40c016566c8f2ec231b5d9"} Apr 16 14:30:03.571046 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.571017 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ktxmg" event={"ID":"8c240568-6c79-4c9e-af56-ea680b0f0410","Type":"ContainerStarted","Data":"cdb8e789daae94c206f076c98e820c4b917752ee2e176f26b5c0e78435635c69"} Apr 16 14:30:03.572497 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.572472 2563 generic.go:358] "Generic (PLEG): container finished" podID="d527d52245c1df0987bf67706224db7f" containerID="ed4db132fc537f7f90ebfce5a15f9daebd134108a6a0ed83e2dca6c366dbe85d" exitCode=0 Apr 16 14:30:03.572594 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.572582 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" event={"ID":"d527d52245c1df0987bf67706224db7f","Type":"ContainerDied","Data":"ed4db132fc537f7f90ebfce5a15f9daebd134108a6a0ed83e2dca6c366dbe85d"} Apr 16 14:30:03.577024 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.576965 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2cqcm" podStartSLOduration=3.30533014 podStartE2EDuration="21.576950602s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.603553663 +0000 UTC m=+1.831283273" lastFinishedPulling="2026-04-16 14:30:01.875174115 +0000 UTC m=+20.102903735" observedRunningTime="2026-04-16 14:30:03.576324037 +0000 UTC 
m=+21.804053666" watchObservedRunningTime="2026-04-16 14:30:03.576950602 +0000 UTC m=+21.804680232" Apr 16 14:30:03.624975 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.624536 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ktxmg" podStartSLOduration=3.410175321 podStartE2EDuration="21.624504233s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.662808297 +0000 UTC m=+1.890537902" lastFinishedPulling="2026-04-16 14:30:01.877137206 +0000 UTC m=+20.104866814" observedRunningTime="2026-04-16 14:30:03.623796607 +0000 UTC m=+21.851526266" watchObservedRunningTime="2026-04-16 14:30:03.624504233 +0000 UTC m=+21.852233862" Apr 16 14:30:03.636694 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.636628 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fvd5j" podStartSLOduration=3.285075323 podStartE2EDuration="21.63661117s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.526791994 +0000 UTC m=+1.754521601" lastFinishedPulling="2026-04-16 14:30:01.878327827 +0000 UTC m=+20.106057448" observedRunningTime="2026-04-16 14:30:03.63647105 +0000 UTC m=+21.864200679" watchObservedRunningTime="2026-04-16 14:30:03.63661117 +0000 UTC m=+21.864340800" Apr 16 14:30:03.649510 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.649457 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-d9rxn" podStartSLOduration=3.475435735 podStartE2EDuration="21.649439721s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.557210279 +0000 UTC m=+1.784939885" lastFinishedPulling="2026-04-16 14:30:01.731214264 +0000 UTC m=+19.958943871" observedRunningTime="2026-04-16 14:30:03.649331688 +0000 UTC m=+21.877061317" watchObservedRunningTime="2026-04-16 14:30:03.649439721 +0000 UTC 
m=+21.877169350" Apr 16 14:30:03.650774 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:03.650742 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:30:04.287384 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.287243 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:30:03.650761856Z","UUID":"4985306f-9675-47f2-93c8-725fb743c984","Handler":null,"Name":"","Endpoint":""} Apr 16 14:30:04.289544 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.289474 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 14:30:04.289544 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.289508 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:30:04.334705 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.334672 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:30:04.334882 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.334734 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:30:04.334882 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:04.334859 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:30:04.335199 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.335044 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:04.335199 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:04.335162 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:30:04.335356 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:04.335281 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:30:04.577003 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.576964 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" event={"ID":"f6c99dab-e384-40fd-849c-ac070671e4ea","Type":"ContainerStarted","Data":"b0a155f1f895673e1897d8f91536c2f18989c7f77d07dcd79c157025383d41a0"}
Apr 16 14:30:04.577464 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.577014 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" event={"ID":"f6c99dab-e384-40fd-849c-ac070671e4ea","Type":"ContainerStarted","Data":"cd80b0a5d850403823ea1767eab7a2faae88fd9962c8ff4c561c489228ff6697"}
Apr 16 14:30:04.579306 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.579143 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" event={"ID":"d527d52245c1df0987bf67706224db7f","Type":"ContainerStarted","Data":"6a1cfe8b46b54f2ffb1716a79466708881bdf1a3a1969dc7cdf71afc5b28f4cd"}
Apr 16 14:30:04.606352 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.606285 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2z6v" podStartSLOduration=1.771264945 podStartE2EDuration="22.606265904s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.61661951 +0000 UTC m=+1.844349116" lastFinishedPulling="2026-04-16 14:30:04.451620457 +0000 UTC m=+22.679350075" observedRunningTime="2026-04-16 14:30:04.605474057 +0000 UTC m=+22.833203684" watchObservedRunningTime="2026-04-16 14:30:04.606265904 +0000 UTC m=+22.833995533"
Apr 16 14:30:04.620981 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:04.620937 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-239.ec2.internal" podStartSLOduration=21.620922065 podStartE2EDuration="21.620922065s" podCreationTimestamp="2026-04-16 14:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:04.62020134 +0000 UTC m=+22.847930969" watchObservedRunningTime="2026-04-16 14:30:04.620922065 +0000 UTC m=+22.848651693"
Apr 16 14:30:05.583647 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:05.583616 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log"
Apr 16 14:30:05.584207 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:05.584168 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"50358d2330d9137056d1e941819a377153efbc74859bbce14f8b09470177482b"}
Apr 16 14:30:06.335376 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:06.335339 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:30:06.335573 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:06.335339 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:30:06.335573 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:06.335469 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:30:06.335573 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:06.335351 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:06.335754 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:06.335578 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:30:06.335754 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:06.335686 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:30:08.062332 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.061942 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:30:08.062981 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.062480 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:30:08.334780 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.334748 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:30:08.334969 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.334809 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:30:08.334969 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.334820 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:08.334969 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:08.334929 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:30:08.335084 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:08.335011 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:30:08.335128 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:08.335104 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:30:08.574197 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.574167 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:30:08.574762 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.574742 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2cqcm"
Apr 16 14:30:08.592161 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.592087 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log"
Apr 16 14:30:08.592407 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.592384 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"4b2336fc01706ad932698443d49e51c7ce04b38fc188e56f8e38c95e49cf571c"}
Apr 16 14:30:08.592779 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.592758 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:30:08.592899 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.592786 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:30:08.592948 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.592919 2563 scope.go:117] "RemoveContainer" containerID="8876253e6a3b4ff27cd05e6cdd29954cb5f7662a52edf53956eeb836545ed7b5"
Apr 16 14:30:08.594042 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.594022 2563 generic.go:358] "Generic (PLEG): container finished" podID="e891483d-413b-4b2d-a2d0-ee6b42f8ccbf" containerID="aeaf806bd1e774af0fab768e080b5780f78d02bd8293edf1a57844d1a5931e97" exitCode=0
Apr 16 14:30:08.594128 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.594112 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerDied","Data":"aeaf806bd1e774af0fab768e080b5780f78d02bd8293edf1a57844d1a5931e97"}
Apr 16 14:30:08.608326 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:08.608299 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:30:09.599379 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.599353 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log"
Apr 16 14:30:09.599902 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.599834 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" event={"ID":"29c3ac6b-dd94-4f4b-88ca-cf83af0046d3","Type":"ContainerStarted","Data":"e5615df45555577f5a3b29482b3be576fcd5ac8bc9825f687022bc8099c99bc6"}
Apr 16 14:30:09.600176 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.600113 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:30:09.602704 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.602674 2563 generic.go:358] "Generic (PLEG): container finished" podID="e891483d-413b-4b2d-a2d0-ee6b42f8ccbf" containerID="71caadf34507dd7340da533f00661ef0da3f6971aae3437838518f577ceaedd2" exitCode=0
Apr 16 14:30:09.602859 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.602758 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerDied","Data":"71caadf34507dd7340da533f00661ef0da3f6971aae3437838518f577ceaedd2"}
Apr 16 14:30:09.613257 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.613225 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9fx7w"]
Apr 16 14:30:09.613452 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.613361 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:30:09.613518 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:09.613484 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:30:09.616590 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.616490 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x5q8k"]
Apr 16 14:30:09.616741 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.616650 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:09.616860 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:09.616747 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:30:09.617320 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.617294 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9hlkg"]
Apr 16 14:30:09.617429 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.617402 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:30:09.617513 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:09.617491 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:30:09.619487 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.619463 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:30:09.630588 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:09.630522 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz" podStartSLOduration=9.092756756 podStartE2EDuration="27.630506425s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.58599602 +0000 UTC m=+1.813725626" lastFinishedPulling="2026-04-16 14:30:02.123745686 +0000 UTC m=+20.351475295" observedRunningTime="2026-04-16 14:30:09.630506421 +0000 UTC m=+27.858236061" watchObservedRunningTime="2026-04-16 14:30:09.630506425 +0000 UTC m=+27.858236051"
Apr 16 14:30:10.606631 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:10.606599 2563 generic.go:358] "Generic (PLEG): container finished" podID="e891483d-413b-4b2d-a2d0-ee6b42f8ccbf" containerID="b5a7e5b64572a5901ee16ae8fb3659ca9dc02442c6be111d5acedd93b5bc7c8f" exitCode=0
Apr 16 14:30:10.607101 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:10.606686 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerDied","Data":"b5a7e5b64572a5901ee16ae8fb3659ca9dc02442c6be111d5acedd93b5bc7c8f"}
Apr 16 14:30:11.334591 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:11.334558 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:11.334775 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:11.334556 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:30:11.334775 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:11.334686 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:30:11.334900 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:11.334791 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:30:11.334900 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:11.334556 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:30:11.334992 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:11.334903 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:30:13.334914 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:13.334864 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:13.334914 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:13.334904 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:30:13.335496 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:13.334864 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:30:13.335496 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:13.335005 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x5q8k" podUID="fd31aa30-3e27-4c57-ae7d-843fa27b25d3"
Apr 16 14:30:13.335496 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:13.335079 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9hlkg" podUID="2edb25cc-c726-4b56-8a1b-f3877bff370e"
Apr 16 14:30:13.335496 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:13.335190 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:30:14.060288 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.060074 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-239.ec2.internal" event="NodeReady"
Apr 16 14:30:14.060426 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.060387 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 14:30:14.097748 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.097716 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt"]
Apr 16 14:30:14.127122 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.127091 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"]
Apr 16 14:30:14.127282 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.127227 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt"
Apr 16 14:30:14.129938 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.129897 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fghmt\""
Apr 16 14:30:14.129938 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.129912 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 14:30:14.130177 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.129960 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 14:30:14.130486 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.130467 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 14:30:14.130600 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.130522 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 14:30:14.149886 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.149856 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"]
Apr 16 14:30:14.150057 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.150030 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"
Apr 16 14:30:14.152447 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.152421 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 14:30:14.152612 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.152476 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 14:30:14.152612 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.152476 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 14:30:14.152612 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.152421 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 14:30:14.170456 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.170429 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"]
Apr 16 14:30:14.170614 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.170595 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.173100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.173051 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:30:14.173100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.173053 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:30:14.173100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.173057 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:30:14.173413 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.173393 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r52pv\""
Apr 16 14:30:14.179675 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.179655 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:30:14.188745 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.188726 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt"]
Apr 16 14:30:14.188852 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.188752 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"]
Apr 16 14:30:14.188852 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.188764 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"]
Apr 16 14:30:14.188852 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.188779 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w2ldg"]
Apr 16 14:30:14.188985 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.188883 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:30:14.192865 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.192129 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 14:30:14.211615 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.211584 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2c9qn"]
Apr 16 14:30:14.211757 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.211720 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:30:14.216158 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.216136 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 14:30:14.216285 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.216170 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 14:30:14.216285 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.216184 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dqsrn\""
Apr 16 14:30:14.229486 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.229461 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"]
Apr 16 14:30:14.229486 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.229485 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w2ldg"]
Apr 16 14:30:14.229655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.229494 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2c9qn"]
Apr 16 14:30:14.229655 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.229618 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:30:14.232167 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.232145 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 14:30:14.232167 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.232145 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 14:30:14.232410 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.232354 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 14:30:14.240350 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.240329 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zpkvj\""
Apr 16 14:30:14.270990 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.270954 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-certificates\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271056 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-installation-pull-secrets\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271093 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-bound-sa-token\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271137 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g97g\" (UniqueName: \"kubernetes.io/projected/56a754a7-5475-4e0b-923a-a226b2357194-kube-api-access-8g97g\") pod \"managed-serviceaccount-addon-agent-86586dbfd9-2cxqt\" (UID: \"56a754a7-5475-4e0b-923a-a226b2357194\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt"
Apr 16 14:30:14.271303 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271180 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8pc4\" (UniqueName: \"kubernetes.io/projected/68f9da2d-9f0d-4f14-af18-0d79355138fa-kube-api-access-g8pc4\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:30:14.271303 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271238 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-image-registry-private-configuration\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271303 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271273 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"
Apr 16 14:30:14.271435 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271326 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fchb\" (UniqueName: \"kubernetes.io/projected/804957af-e065-45ad-a33e-6ce7f5097eb3-kube-api-access-8fchb\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"
Apr 16 14:30:14.271435 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271344 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/68f9da2d-9f0d-4f14-af18-0d79355138fa-tmp\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:30:14.271435 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271372 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-ca\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"
Apr 16 14:30:14.271435 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271402 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271435 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271424 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-trusted-ca\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271442 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/804957af-e065-45ad-a33e-6ce7f5097eb3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"
Apr 16 14:30:14.271692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271508 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/56a754a7-5475-4e0b-923a-a226b2357194-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86586dbfd9-2cxqt\" (UID: \"56a754a7-5475-4e0b-923a-a226b2357194\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt"
Apr 16 14:30:14.271692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271563 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-hub\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"
Apr 16 14:30:14.271692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271591 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3789-cd2d-43ee-8413-947a344bcef3-ca-trust-extracted\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271617 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwqn\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-kube-api-access-nxwqn\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:14.271692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271644 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/68f9da2d-9f0d-4f14-af18-0d79355138fa-klusterlet-config\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:30:14.271692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.271688 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-service-proxy-server-cert\") pod
\"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.372387 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.372342 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.372870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.372395 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fchb\" (UniqueName: \"kubernetes.io/projected/804957af-e065-45ad-a33e-6ce7f5097eb3-kube-api-access-8fchb\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.372870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.372569 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/68f9da2d-9f0d-4f14-af18-0d79355138fa-tmp\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" Apr 16 14:30:14.372870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.372613 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6355259e-a870-4945-9b77-524fb13888c6-tmp-dir\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.372870 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:30:14.372665 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-ca\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.372870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.372699 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.372870 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.372772 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-trusted-ca\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.372870 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.372871 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:30:14.373180 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.372888 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found Apr 16 14:30:14.373180 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.372951 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:30:14.872934245 +0000 UTC m=+33.100663851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found Apr 16 14:30:14.373180 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373128 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/804957af-e065-45ad-a33e-6ce7f5097eb3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.373180 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373155 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/56a754a7-5475-4e0b-923a-a226b2357194-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86586dbfd9-2cxqt\" (UID: \"56a754a7-5475-4e0b-923a-a226b2357194\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" Apr 16 14:30:14.373180 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373178 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-hub\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.373419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373206 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/ff4a3789-cd2d-43ee-8413-947a344bcef3-ca-trust-extracted\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.373419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373235 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwqn\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-kube-api-access-nxwqn\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.373419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373262 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/68f9da2d-9f0d-4f14-af18-0d79355138fa-klusterlet-config\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" Apr 16 14:30:14.373419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373291 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.373419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373321 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64sfg\" (UniqueName: \"kubernetes.io/projected/04ae3980-8ad0-4077-87e4-d094e30cba62-kube-api-access-64sfg\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn" Apr 16 
14:30:14.373681 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373659 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3789-cd2d-43ee-8413-947a344bcef3-ca-trust-extracted\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.373728 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373710 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.373779 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373764 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-certificates\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.373825 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373803 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-installation-pull-secrets\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.373873 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373829 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-bound-sa-token\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.373873 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373840 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-trusted-ca\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.373873 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373859 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrg24\" (UniqueName: \"kubernetes.io/projected/6355259e-a870-4945-9b77-524fb13888c6-kube-api-access-lrg24\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.373991 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373176 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/68f9da2d-9f0d-4f14-af18-0d79355138fa-tmp\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" Apr 16 14:30:14.373991 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.373966 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g97g\" (UniqueName: \"kubernetes.io/projected/56a754a7-5475-4e0b-923a-a226b2357194-kube-api-access-8g97g\") pod \"managed-serviceaccount-addon-agent-86586dbfd9-2cxqt\" (UID: \"56a754a7-5475-4e0b-923a-a226b2357194\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" Apr 
16 14:30:14.374096 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.374007 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8pc4\" (UniqueName: \"kubernetes.io/projected/68f9da2d-9f0d-4f14-af18-0d79355138fa-kube-api-access-g8pc4\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" Apr 16 14:30:14.374096 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.374037 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6355259e-a870-4945-9b77-524fb13888c6-config-volume\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.374185 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.374110 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn" Apr 16 14:30:14.374185 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.374148 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-image-registry-private-configuration\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.374686 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.374617 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/804957af-e065-45ad-a33e-6ce7f5097eb3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.375958 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.375934 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-certificates\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.378360 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.378120 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-hub\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.378360 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.378162 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.378565 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.378475 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-ca\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.378632 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.378611 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/68f9da2d-9f0d-4f14-af18-0d79355138fa-klusterlet-config\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" Apr 16 14:30:14.378710 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.378685 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/56a754a7-5475-4e0b-923a-a226b2357194-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86586dbfd9-2cxqt\" (UID: \"56a754a7-5475-4e0b-923a-a226b2357194\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" Apr 16 14:30:14.379302 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.379279 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/804957af-e065-45ad-a33e-6ce7f5097eb3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.379393 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.379328 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-image-registry-private-configuration\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.380258 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:30:14.380237 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-installation-pull-secrets\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.382627 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.382596 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwqn\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-kube-api-access-nxwqn\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.382627 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.382610 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fchb\" (UniqueName: \"kubernetes.io/projected/804957af-e065-45ad-a33e-6ce7f5097eb3-kube-api-access-8fchb\") pod \"cluster-proxy-proxy-agent-5d6df67564-h5v4t\" (UID: \"804957af-e065-45ad-a33e-6ce7f5097eb3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.386246 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.386224 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g97g\" (UniqueName: \"kubernetes.io/projected/56a754a7-5475-4e0b-923a-a226b2357194-kube-api-access-8g97g\") pod \"managed-serviceaccount-addon-agent-86586dbfd9-2cxqt\" (UID: \"56a754a7-5475-4e0b-923a-a226b2357194\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" Apr 16 14:30:14.386407 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.386388 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8pc4\" (UniqueName: 
\"kubernetes.io/projected/68f9da2d-9f0d-4f14-af18-0d79355138fa-kube-api-access-g8pc4\") pod \"klusterlet-addon-workmgr-5fb499bfb6-m5zzw\" (UID: \"68f9da2d-9f0d-4f14-af18-0d79355138fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" Apr 16 14:30:14.389769 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.389739 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-bound-sa-token\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.452341 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.452305 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" Apr 16 14:30:14.460377 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.460351 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:30:14.475315 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.475285 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrg24\" (UniqueName: \"kubernetes.io/projected/6355259e-a870-4945-9b77-524fb13888c6-kube-api-access-lrg24\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.475493 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.475336 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6355259e-a870-4945-9b77-524fb13888c6-config-volume\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.475580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.475492 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn" Apr 16 14:30:14.475580 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.475568 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6355259e-a870-4945-9b77-524fb13888c6-tmp-dir\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.475684 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.475640 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: 
\"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.475684 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.475665 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64sfg\" (UniqueName: \"kubernetes.io/projected/04ae3980-8ad0-4077-87e4-d094e30cba62-kube-api-access-64sfg\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn" Apr 16 14:30:14.475780 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.475756 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:14.475846 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.475832 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:14.975811547 +0000 UTC m=+33.203541166 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found Apr 16 14:30:14.475908 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.475891 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:14.475956 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.475931 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6355259e-a870-4945-9b77-524fb13888c6-tmp-dir\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.476001 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.475956 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:14.975937669 +0000 UTC m=+33.203667289 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found Apr 16 14:30:14.485289 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.485259 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64sfg\" (UniqueName: \"kubernetes.io/projected/04ae3980-8ad0-4077-87e4-d094e30cba62-kube-api-access-64sfg\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn" Apr 16 14:30:14.486148 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.486104 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrg24\" (UniqueName: \"kubernetes.io/projected/6355259e-a870-4945-9b77-524fb13888c6-kube-api-access-lrg24\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.486270 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.486252 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6355259e-a870-4945-9b77-524fb13888c6-config-volume\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:14.500189 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.500161 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" Apr 16 14:30:14.879219 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.879178 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:30:14.879417 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.879290 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:30:14.879417 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.879311 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found Apr 16 14:30:14.879417 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.879376 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:15.879359827 +0000 UTC m=+34.107089433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found
Apr 16 14:30:14.980464 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.980423 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.980488 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:14.980569 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.980602 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.980629 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.980675 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:46.980656639 +0000 UTC m=+65.208386250 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.980688 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.980691 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:15.980685544 +0000 UTC m=+34.208415153 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found
Apr 16 14:30:14.981203 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:14.980758 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:15.980745477 +0000 UTC m=+34.208475082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:15.081287 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.081256 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:30:15.081478 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.081456 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:30:15.081592 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.081485 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:30:15.081592 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.081499 2563 projected.go:194] Error preparing data for projected volume kube-api-access-bcghq for pod openshift-network-diagnostics/network-check-target-9hlkg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:30:15.081592 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.081581 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq podName:2edb25cc-c726-4b56-8a1b-f3877bff370e nodeName:}" failed. No retries permitted until 2026-04-16 14:30:47.081559985 +0000 UTC m=+65.309289602 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-bcghq" (UniqueName: "kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq") pod "network-check-target-9hlkg" (UID: "2edb25cc-c726-4b56-8a1b-f3877bff370e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:30:15.334898 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.334858 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:30:15.335100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.334859 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:15.335303 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.334859 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg"
Apr 16 14:30:15.337310 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.337289 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:30:15.337438 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.337324 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:30:15.337438 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.337368 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bl9dn\""
Apr 16 14:30:15.337704 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.337686 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:30:15.338320 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.338300 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:30:15.338404 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.338342 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8s5q\""
Apr 16 14:30:15.887396 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.887359 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:15.888082 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.887556 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:30:15.888082 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.887578 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found
Apr 16 14:30:15.888082 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.887646 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:17.887625969 +0000 UTC m=+36.115355574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found
Apr 16 14:30:15.988649 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.988614 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:30:15.988858 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:15.988691 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:30:15.988858 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.988789 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:15.988858 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.988804 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:15.989011 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.988874 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:17.988852734 +0000 UTC m=+36.216582360 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found
Apr 16 14:30:15.989011 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:15.988894 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:17.98888469 +0000 UTC m=+36.216614311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:16.417725 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.417681 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt"]
Apr 16 14:30:16.429550 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.429507 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"]
Apr 16 14:30:16.435765 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.435739 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t"]
Apr 16 14:30:16.512640 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:30:16.512564 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56a754a7_5475_4e0b_923a_a226b2357194.slice/crio-de4e880c70c9cef2b25cb27601cba1df50384d3214518205a950b94aa65ac4e1 WatchSource:0}: Error finding container de4e880c70c9cef2b25cb27601cba1df50384d3214518205a950b94aa65ac4e1: Status 404 returned error can't find the container with id de4e880c70c9cef2b25cb27601cba1df50384d3214518205a950b94aa65ac4e1
Apr 16 14:30:16.513136 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:30:16.513095 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f9da2d_9f0d_4f14_af18_0d79355138fa.slice/crio-497b04ea59a9793850fb28288e2ea2299adfa1cf863266905e6d81e17bc25148 WatchSource:0}: Error finding container 497b04ea59a9793850fb28288e2ea2299adfa1cf863266905e6d81e17bc25148: Status 404 returned error can't find the container with id 497b04ea59a9793850fb28288e2ea2299adfa1cf863266905e6d81e17bc25148
Apr 16 14:30:16.513669 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:30:16.513642 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod804957af_e065_45ad_a33e_6ce7f5097eb3.slice/crio-364e00aeb2a7d6435e54ccbee4c4443bda65e4fdb452e76868267f053eab9d3c WatchSource:0}: Error finding container 364e00aeb2a7d6435e54ccbee4c4443bda65e4fdb452e76868267f053eab9d3c: Status 404 returned error can't find the container with id 364e00aeb2a7d6435e54ccbee4c4443bda65e4fdb452e76868267f053eab9d3c
Apr 16 14:30:16.620790 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.620763 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" event={"ID":"804957af-e065-45ad-a33e-6ce7f5097eb3","Type":"ContainerStarted","Data":"364e00aeb2a7d6435e54ccbee4c4443bda65e4fdb452e76868267f053eab9d3c"}
Apr 16 14:30:16.621784 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.621756 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" event={"ID":"56a754a7-5475-4e0b-923a-a226b2357194","Type":"ContainerStarted","Data":"de4e880c70c9cef2b25cb27601cba1df50384d3214518205a950b94aa65ac4e1"}
Apr 16 14:30:16.622989 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.622962 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" event={"ID":"68f9da2d-9f0d-4f14-af18-0d79355138fa","Type":"ContainerStarted","Data":"497b04ea59a9793850fb28288e2ea2299adfa1cf863266905e6d81e17bc25148"}
Apr 16 14:30:16.796696 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.796312 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:16.800143 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.800115 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd31aa30-3e27-4c57-ae7d-843fa27b25d3-original-pull-secret\") pod \"global-pull-secret-syncer-x5q8k\" (UID: \"fd31aa30-3e27-4c57-ae7d-843fa27b25d3\") " pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:16.853991 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.853957 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x5q8k"
Apr 16 14:30:16.968675 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:16.968642 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x5q8k"]
Apr 16 14:30:16.971987 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:30:16.971960 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd31aa30_3e27_4c57_ae7d_843fa27b25d3.slice/crio-9777f7ac6e594f2d17aaacf3581f7c9a8c9dd4773858e627016b3c1432131b91 WatchSource:0}: Error finding container 9777f7ac6e594f2d17aaacf3581f7c9a8c9dd4773858e627016b3c1432131b91: Status 404 returned error can't find the container with id 9777f7ac6e594f2d17aaacf3581f7c9a8c9dd4773858e627016b3c1432131b91
Apr 16 14:30:17.629432 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:17.629281 2563 generic.go:358] "Generic (PLEG): container finished" podID="e891483d-413b-4b2d-a2d0-ee6b42f8ccbf" containerID="845c1b9ac9879ad33b8def2c462c40e3ccb47bc7c39f813a0aa80696602530b5" exitCode=0
Apr 16 14:30:17.629432 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:17.629387 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerDied","Data":"845c1b9ac9879ad33b8def2c462c40e3ccb47bc7c39f813a0aa80696602530b5"}
Apr 16 14:30:17.631861 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:17.631764 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x5q8k" event={"ID":"fd31aa30-3e27-4c57-ae7d-843fa27b25d3","Type":"ContainerStarted","Data":"9777f7ac6e594f2d17aaacf3581f7c9a8c9dd4773858e627016b3c1432131b91"}
Apr 16 14:30:17.905901 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:17.905866 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:17.906078 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:17.906044 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:30:17.906078 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:17.906062 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found
Apr 16 14:30:17.906192 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:17.906120 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:21.906101208 +0000 UTC m=+40.133830828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found
Apr 16 14:30:18.007047 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:18.006765 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:30:18.007047 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:18.006868 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:30:18.007047 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:18.006923 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:18.007047 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:18.006976 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:18.007047 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:18.006988 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:22.006969184 +0000 UTC m=+40.234698805 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found
Apr 16 14:30:18.007047 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:18.007028 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:22.00701892 +0000 UTC m=+40.234748530 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:18.643309 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:18.643272 2563 generic.go:358] "Generic (PLEG): container finished" podID="e891483d-413b-4b2d-a2d0-ee6b42f8ccbf" containerID="163965a3fe3cb95387810fb26926ce227fbbab68ba52aa6b4e9f0e21b10fbe97" exitCode=0
Apr 16 14:30:18.643478 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:18.643322 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerDied","Data":"163965a3fe3cb95387810fb26926ce227fbbab68ba52aa6b4e9f0e21b10fbe97"}
Apr 16 14:30:21.945767 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:21.945721 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:21.946189 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:21.945879 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:30:21.946189 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:21.945899 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found
Apr 16 14:30:21.946189 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:21.945955 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:29.945939792 +0000 UTC m=+48.173669397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found
Apr 16 14:30:22.046946 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:22.046906 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:30:22.047115 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:22.047012 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:30:22.047115 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:22.047096 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:22.047202 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:22.047118 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:22.047202 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:22.047183 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:30.047162751 +0000 UTC m=+48.274892362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found
Apr 16 14:30:22.047305 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:22.047204 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:30.047194843 +0000 UTC m=+48.274924450 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:25.660470 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.660433 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x5q8k" event={"ID":"fd31aa30-3e27-4c57-ae7d-843fa27b25d3","Type":"ContainerStarted","Data":"8adc49304cf42f2e32c14c4eae9dc6ae12d53a93b29f99c19f3f6a0f85cebf96"}
Apr 16 14:30:25.663563 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.663517 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" event={"ID":"e891483d-413b-4b2d-a2d0-ee6b42f8ccbf","Type":"ContainerStarted","Data":"c81901ac3c16b4c9b54340dda4cce7fb23af1cf48bd0c58147acdb93c761ebc0"}
Apr 16 14:30:25.664886 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.664862 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" event={"ID":"68f9da2d-9f0d-4f14-af18-0d79355138fa","Type":"ContainerStarted","Data":"702fe788ee469104a0383f54f92eba990bf49e2b7cc6314f9e0690f9e089087f"}
Apr 16 14:30:25.665041 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.665025 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:30:25.666144 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.666124 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" event={"ID":"804957af-e065-45ad-a33e-6ce7f5097eb3","Type":"ContainerStarted","Data":"b9923434124e4ce84b047c143f6f6d8d60289c5ce36338c226154425759c23bf"}
Apr 16 14:30:25.666954 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.666935 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:30:25.667428 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.667410 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" event={"ID":"56a754a7-5475-4e0b-923a-a226b2357194","Type":"ContainerStarted","Data":"14a07c1bb7e5955dfd132415110041e2b032959c97687f34d91c354141fdf28c"}
Apr 16 14:30:25.676841 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.676802 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x5q8k" podStartSLOduration=33.645911605 podStartE2EDuration="41.676790159s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:30:16.973732644 +0000 UTC m=+35.201462249" lastFinishedPulling="2026-04-16 14:30:25.004611181 +0000 UTC m=+43.232340803" observedRunningTime="2026-04-16 14:30:25.676191111 +0000 UTC m=+43.903920743" watchObservedRunningTime="2026-04-16 14:30:25.676790159 +0000 UTC m=+43.904519816"
Apr 16 14:30:25.694276 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.694225 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" podStartSLOduration=27.192368756 podStartE2EDuration="35.694212019s" podCreationTimestamp="2026-04-16 14:29:50 +0000 UTC" firstStartedPulling="2026-04-16 14:30:16.520941952 +0000 UTC m=+34.748671560" lastFinishedPulling="2026-04-16 14:30:25.022785217 +0000 UTC m=+43.250514823" observedRunningTime="2026-04-16 14:30:25.693596377 +0000 UTC m=+43.921326006" watchObservedRunningTime="2026-04-16 14:30:25.694212019 +0000 UTC m=+43.921941688"
Apr 16 14:30:25.719206 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.716494 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4p4tq" podStartSLOduration=10.723484331 podStartE2EDuration="43.716477067s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:29:43.550462015 +0000 UTC m=+1.778191622" lastFinishedPulling="2026-04-16 14:30:16.543454753 +0000 UTC m=+34.771184358" observedRunningTime="2026-04-16 14:30:25.715140166 +0000 UTC m=+43.942869806" watchObservedRunningTime="2026-04-16 14:30:25.716477067 +0000 UTC m=+43.944206696"
Apr 16 14:30:25.733189 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:25.733143 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" podStartSLOduration=27.250435765 podStartE2EDuration="35.733129998s" podCreationTimestamp="2026-04-16 14:29:50 +0000 UTC" firstStartedPulling="2026-04-16 14:30:16.521341622 +0000 UTC m=+34.749071228" lastFinishedPulling="2026-04-16 14:30:25.004035841 +0000 UTC m=+43.231765461" observedRunningTime="2026-04-16 14:30:25.732096278 +0000 UTC m=+43.959825907" watchObservedRunningTime="2026-04-16 14:30:25.733129998 +0000 UTC m=+43.960859627"
Apr 16 14:30:28.676333 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:28.676290 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" event={"ID":"804957af-e065-45ad-a33e-6ce7f5097eb3","Type":"ContainerStarted","Data":"94b191e6a7b8f2119551a966016b02e0730bee8803754d699afd685942bbbdae"}
Apr 16 14:30:28.676333 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:28.676330 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" event={"ID":"804957af-e065-45ad-a33e-6ce7f5097eb3","Type":"ContainerStarted","Data":"1f0a720b1d15c273a175e9a47a863230b3a75c09d9847ca4e162bea006267a44"}
Apr 16 14:30:28.697298 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:28.697240 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" podStartSLOduration=27.603616739 podStartE2EDuration="38.697223035s" podCreationTimestamp="2026-04-16 14:29:50 +0000 UTC" firstStartedPulling="2026-04-16 14:30:16.521186113 +0000 UTC m=+34.748915720" lastFinishedPulling="2026-04-16 14:30:27.614792398 +0000 UTC m=+45.842522016" observedRunningTime="2026-04-16 14:30:28.696250918 +0000 UTC m=+46.923980545" watchObservedRunningTime="2026-04-16 14:30:28.697223035 +0000 UTC m=+46.924952664"
Apr 16 14:30:30.006814 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:30.006773 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:30.007214 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:30.006917 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:30:30.007214 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:30.006936 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found
Apr 16 14:30:30.007214 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:30.006992 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:46.006977218 +0000 UTC m=+64.234706823 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found
Apr 16 14:30:30.107962 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:30.107925 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:30:30.108139 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:30.107989 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:30:30.108139 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:30.108072 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:30.108139 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:30.108107 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:30.108139 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:30.108137 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:46.108121209 +0000 UTC m=+64.335850816 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found
Apr 16 14:30:30.108317 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:30.108153 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:46.108143215 +0000 UTC m=+64.335872821 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:41.619598 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:41.619565 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8wdkz"
Apr 16 14:30:46.025632 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:46.025584 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:30:46.026032 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:46.025734 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:30:46.026032 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:46.025749 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found
Apr 
16 14:30:46.026032 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:46.025807 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:18.025789638 +0000 UTC m=+96.253519268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found Apr 16 14:30:46.126521 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:46.126488 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:30:46.126662 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:46.126589 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn" Apr 16 14:30:46.126662 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:46.126648 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:46.126732 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:46.126671 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:46.126732 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:46.126715 2563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:18.126697635 +0000 UTC m=+96.354427241 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found Apr 16 14:30:46.126732 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:46.126732 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:18.126723205 +0000 UTC m=+96.354452814 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found Apr 16 14:30:47.035886 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.035843 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:30:47.038843 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.038822 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:30:47.046222 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:47.046200 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:30:47.046274 
ip-10-0-141-239 kubenswrapper[2563]: E0416 14:30:47.046257 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:51.04624236 +0000 UTC m=+129.273971970 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : secret "metrics-daemon-secret" not found Apr 16 14:30:47.136931 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.136896 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:30:47.140029 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.140010 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:30:47.149805 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.149780 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:30:47.161357 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.161322 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcghq\" (UniqueName: \"kubernetes.io/projected/2edb25cc-c726-4b56-8a1b-f3877bff370e-kube-api-access-bcghq\") pod \"network-check-target-9hlkg\" (UID: \"2edb25cc-c726-4b56-8a1b-f3877bff370e\") " pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:30:47.463431 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:30:47.463397 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8s5q\"" Apr 16 14:30:47.471378 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.471357 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:30:47.590452 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.590418 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9hlkg"] Apr 16 14:30:47.593686 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:30:47.593656 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2edb25cc_c726_4b56_8a1b_f3877bff370e.slice/crio-614a6234fd4620a1d61a45e8adb278154cc54f6064c78c73153e47fa21521cb8 WatchSource:0}: Error finding container 614a6234fd4620a1d61a45e8adb278154cc54f6064c78c73153e47fa21521cb8: Status 404 returned error can't find the container with id 614a6234fd4620a1d61a45e8adb278154cc54f6064c78c73153e47fa21521cb8 Apr 16 14:30:47.727942 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:47.727852 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9hlkg" event={"ID":"2edb25cc-c726-4b56-8a1b-f3877bff370e","Type":"ContainerStarted","Data":"614a6234fd4620a1d61a45e8adb278154cc54f6064c78c73153e47fa21521cb8"} Apr 16 14:30:51.739413 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:51.739371 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9hlkg" event={"ID":"2edb25cc-c726-4b56-8a1b-f3877bff370e","Type":"ContainerStarted","Data":"2241587b9105f1102519a304db0d2c55fb701584610af69aa598fdc723aab35d"} Apr 16 14:30:51.739832 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:51.739499 2563 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:30:51.763131 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:30:51.763078 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9hlkg" podStartSLOduration=66.638274526 podStartE2EDuration="1m9.763064418s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:30:47.595792412 +0000 UTC m=+65.823522021" lastFinishedPulling="2026-04-16 14:30:50.720582305 +0000 UTC m=+68.948311913" observedRunningTime="2026-04-16 14:30:51.761806389 +0000 UTC m=+69.989536018" watchObservedRunningTime="2026-04-16 14:30:51.763064418 +0000 UTC m=+69.990794036" Apr 16 14:31:18.082987 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:31:18.082836 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:31:18.082987 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:18.082984 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:31:18.083519 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:18.083005 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g: secret "image-registry-tls" not found Apr 16 14:31:18.083519 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:18.083058 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls podName:ff4a3789-cd2d-43ee-8413-947a344bcef3 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:32:22.08304282 +0000 UTC m=+160.310772429 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls") pod "image-registry-5dbc4dd5d5-8cm2g" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3") : secret "image-registry-tls" not found Apr 16 14:31:18.183709 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:31:18.183672 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn" Apr 16 14:31:18.183910 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:31:18.183733 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg" Apr 16 14:31:18.183910 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:18.183813 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:31:18.183910 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:18.183816 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:31:18.183910 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:18.183870 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls podName:6355259e-a870-4945-9b77-524fb13888c6 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:22.183857162 +0000 UTC m=+160.411586772 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls") pod "dns-default-w2ldg" (UID: "6355259e-a870-4945-9b77-524fb13888c6") : secret "dns-default-metrics-tls" not found Apr 16 14:31:18.183910 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:18.183882 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert podName:04ae3980-8ad0-4077-87e4-d094e30cba62 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:22.183876728 +0000 UTC m=+160.411606334 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert") pod "ingress-canary-2c9qn" (UID: "04ae3980-8ad0-4077-87e4-d094e30cba62") : secret "canary-serving-cert" not found Apr 16 14:31:22.744693 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:31:22.744658 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9hlkg" Apr 16 14:31:51.128613 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:31:51.128572 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:31:51.129126 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:51.128689 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:31:51.129126 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:31:51.128756 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs podName:e7706545-6db6-4426-919c-bf83b5020047 nodeName:}" 
failed. No retries permitted until 2026-04-16 14:33:53.128742895 +0000 UTC m=+251.356472500 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs") pod "network-metrics-daemon-9fx7w" (UID: "e7706545-6db6-4426-919c-bf83b5020047") : secret "metrics-daemon-secret" not found Apr 16 14:31:53.430933 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:31:53.430897 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ktxmg_8c240568-6c79-4c9e-af56-ea680b0f0410/dns-node-resolver/0.log" Apr 16 14:31:54.830720 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:31:54.830691 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fvd5j_898da78a-88a7-4608-baa8-e5a6bdba777f/node-ca/0.log" Apr 16 14:32:15.564602 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.564564 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-45bjl"] Apr 16 14:32:15.567615 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.567595 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.571738 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.571718 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k65jf\"" Apr 16 14:32:15.574173 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.574147 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:32:15.574416 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.574396 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:32:15.574500 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.574403 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:32:15.574500 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.574465 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:32:15.600026 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.599988 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-45bjl"] Apr 16 14:32:15.620317 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.620288 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e3c4378b-8642-4e54-b206-e56cbc51fa51-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.620471 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.620325 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e3c4378b-8642-4e54-b206-e56cbc51fa51-data-volume\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.620471 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.620399 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97qsn\" (UniqueName: \"kubernetes.io/projected/e3c4378b-8642-4e54-b206-e56cbc51fa51-kube-api-access-97qsn\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.620471 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.620418 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e3c4378b-8642-4e54-b206-e56cbc51fa51-crio-socket\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.620471 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.620446 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e3c4378b-8642-4e54-b206-e56cbc51fa51-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721005 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.720962 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97qsn\" (UniqueName: \"kubernetes.io/projected/e3c4378b-8642-4e54-b206-e56cbc51fa51-kube-api-access-97qsn\") pod \"insights-runtime-extractor-45bjl\" (UID: 
\"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721005 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.721003 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e3c4378b-8642-4e54-b206-e56cbc51fa51-crio-socket\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721271 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.721129 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e3c4378b-8642-4e54-b206-e56cbc51fa51-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721271 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.721163 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e3c4378b-8642-4e54-b206-e56cbc51fa51-crio-socket\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721271 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.721210 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e3c4378b-8642-4e54-b206-e56cbc51fa51-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721271 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.721253 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/e3c4378b-8642-4e54-b206-e56cbc51fa51-data-volume\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721621 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.721602 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e3c4378b-8642-4e54-b206-e56cbc51fa51-data-volume\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.721791 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.721776 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e3c4378b-8642-4e54-b206-e56cbc51fa51-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.723619 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.723604 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e3c4378b-8642-4e54-b206-e56cbc51fa51-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.732997 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.732975 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97qsn\" (UniqueName: \"kubernetes.io/projected/e3c4378b-8642-4e54-b206-e56cbc51fa51-kube-api-access-97qsn\") pod \"insights-runtime-extractor-45bjl\" (UID: \"e3c4378b-8642-4e54-b206-e56cbc51fa51\") " pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.876984 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:32:15.876951 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-45bjl" Apr 16 14:32:15.997330 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:15.997297 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-45bjl"] Apr 16 14:32:16.000714 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:32:16.000679 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c4378b_8642_4e54_b206_e56cbc51fa51.slice/crio-7ce999e7833ad6ea528b0e773e527c6b169a5dea62834d47c675ed76dccc5feb WatchSource:0}: Error finding container 7ce999e7833ad6ea528b0e773e527c6b169a5dea62834d47c675ed76dccc5feb: Status 404 returned error can't find the container with id 7ce999e7833ad6ea528b0e773e527c6b169a5dea62834d47c675ed76dccc5feb Apr 16 14:32:16.941813 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:16.941735 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-45bjl" event={"ID":"e3c4378b-8642-4e54-b206-e56cbc51fa51","Type":"ContainerStarted","Data":"b75a344b00d072a56698ebc497584620d56be8963f518fa5f05d4cd8d95a0f4c"} Apr 16 14:32:16.941813 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:16.941767 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-45bjl" event={"ID":"e3c4378b-8642-4e54-b206-e56cbc51fa51","Type":"ContainerStarted","Data":"59afd371575466f9db0193f514fa70650ee36d58dcf57cdb9d967b3f73144075"} Apr 16 14:32:16.941813 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:16.941777 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-45bjl" event={"ID":"e3c4378b-8642-4e54-b206-e56cbc51fa51","Type":"ContainerStarted","Data":"7ce999e7833ad6ea528b0e773e527c6b169a5dea62834d47c675ed76dccc5feb"} Apr 16 14:32:17.191214 
ip-10-0-141-239 kubenswrapper[2563]: E0416 14:32:17.191174 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" podUID="ff4a3789-cd2d-43ee-8413-947a344bcef3"
Apr 16 14:32:17.221386 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:32:17.221297 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-w2ldg" podUID="6355259e-a870-4945-9b77-524fb13888c6"
Apr 16 14:32:17.239637 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:32:17.239598 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2c9qn" podUID="04ae3980-8ad0-4077-87e4-d094e30cba62"
Apr 16 14:32:17.944797 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:17.944768 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:32:17.945140 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:17.944768 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:32:18.346808 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:32:18.346753 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9fx7w" podUID="e7706545-6db6-4426-919c-bf83b5020047"
Apr 16 14:32:18.948788 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:18.948751 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-45bjl" event={"ID":"e3c4378b-8642-4e54-b206-e56cbc51fa51","Type":"ContainerStarted","Data":"bfe9297eb8591108792da34e0638f1c26a4477307f29d18b61a62acb99e22e37"}
Apr 16 14:32:18.967492 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:18.967450 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-45bjl" podStartSLOduration=2.050020872 podStartE2EDuration="3.967438298s" podCreationTimestamp="2026-04-16 14:32:15 +0000 UTC" firstStartedPulling="2026-04-16 14:32:16.053031902 +0000 UTC m=+154.280761512" lastFinishedPulling="2026-04-16 14:32:17.97044933 +0000 UTC m=+156.198178938" observedRunningTime="2026-04-16 14:32:18.965939155 +0000 UTC m=+157.193668784" watchObservedRunningTime="2026-04-16 14:32:18.967438298 +0000 UTC m=+157.195167950"
Apr 16 14:32:22.172427 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.172390 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:32:22.174745 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.174726 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"image-registry-5dbc4dd5d5-8cm2g\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") " pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:32:22.272705 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.272674 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:32:22.272862 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.272746 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:32:22.274971 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.274947 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6355259e-a870-4945-9b77-524fb13888c6-metrics-tls\") pod \"dns-default-w2ldg\" (UID: \"6355259e-a870-4945-9b77-524fb13888c6\") " pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:32:22.275065 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.274996 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/04ae3980-8ad0-4077-87e4-d094e30cba62-cert\") pod \"ingress-canary-2c9qn\" (UID: \"04ae3980-8ad0-4077-87e4-d094e30cba62\") " pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:32:22.448805 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.448735 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dqsrn\""
Apr 16 14:32:22.448805 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.448743 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r52pv\""
Apr 16 14:32:22.455781 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.455761 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:32:22.455920 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.455906 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:32:22.580367 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.580340 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w2ldg"]
Apr 16 14:32:22.583717 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:32:22.583691 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6355259e_a870_4945_9b77_524fb13888c6.slice/crio-7904723922ee681a46e97138bc45824cd3c777964f0ef409b56be6415e6793cf WatchSource:0}: Error finding container 7904723922ee681a46e97138bc45824cd3c777964f0ef409b56be6415e6793cf: Status 404 returned error can't find the container with id 7904723922ee681a46e97138bc45824cd3c777964f0ef409b56be6415e6793cf
Apr 16 14:32:22.598776 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.598750 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"]
Apr 16 14:32:22.601704 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:32:22.601678 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4a3789_cd2d_43ee_8413_947a344bcef3.slice/crio-42885cb5509392f8995374632b936f83ae90aa4f7b3e35537d5daecacfdd000d WatchSource:0}: Error finding container 42885cb5509392f8995374632b936f83ae90aa4f7b3e35537d5daecacfdd000d: Status 404 returned error can't find the container with id 42885cb5509392f8995374632b936f83ae90aa4f7b3e35537d5daecacfdd000d
Apr 16 14:32:22.959442 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.959403 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" event={"ID":"ff4a3789-cd2d-43ee-8413-947a344bcef3","Type":"ContainerStarted","Data":"74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359"}
Apr 16 14:32:22.959657 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.959451 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" event={"ID":"ff4a3789-cd2d-43ee-8413-947a344bcef3","Type":"ContainerStarted","Data":"42885cb5509392f8995374632b936f83ae90aa4f7b3e35537d5daecacfdd000d"}
Apr 16 14:32:22.959657 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.959504 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:32:22.960554 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.960512 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w2ldg" event={"ID":"6355259e-a870-4945-9b77-524fb13888c6","Type":"ContainerStarted","Data":"7904723922ee681a46e97138bc45824cd3c777964f0ef409b56be6415e6793cf"}
Apr 16 14:32:22.984554 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:22.982476 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" podStartSLOduration=139.982451633 podStartE2EDuration="2m19.982451633s" podCreationTimestamp="2026-04-16 14:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:32:22.981712875 +0000 UTC m=+161.209442504" watchObservedRunningTime="2026-04-16 14:32:22.982451633 +0000 UTC m=+161.210181259"
Apr 16 14:32:24.967745 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:24.967705 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w2ldg" event={"ID":"6355259e-a870-4945-9b77-524fb13888c6","Type":"ContainerStarted","Data":"bd00df82964adb2ca73bdfd3f3005a0d186f01cf1205e4843318040e3ca0623d"}
Apr 16 14:32:24.968188 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:24.967751 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w2ldg" event={"ID":"6355259e-a870-4945-9b77-524fb13888c6","Type":"ContainerStarted","Data":"07dddb9c87e662d829e552a2a16d09298d1d0a39fe7e7d44797a6fe680540a15"}
Apr 16 14:32:24.968188 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:24.967905 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:32:24.986447 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:24.986404 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w2ldg" podStartSLOduration=129.663355761 podStartE2EDuration="2m10.986389446s" podCreationTimestamp="2026-04-16 14:30:14 +0000 UTC" firstStartedPulling="2026-04-16 14:32:22.585424383 +0000 UTC m=+160.813153992" lastFinishedPulling="2026-04-16 14:32:23.908458072 +0000 UTC m=+162.136187677" observedRunningTime="2026-04-16 14:32:24.98477713 +0000 UTC m=+163.212506772" watchObservedRunningTime="2026-04-16 14:32:24.986389446 +0000 UTC m=+163.214119077"
Apr 16 14:32:25.665692 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:25.665628 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" podUID="68f9da2d-9f0d-4f14-af18-0d79355138fa" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused"
Apr 16 14:32:25.971979 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:25.971897 2563 generic.go:358] "Generic (PLEG): container finished" podID="56a754a7-5475-4e0b-923a-a226b2357194" containerID="14a07c1bb7e5955dfd132415110041e2b032959c97687f34d91c354141fdf28c" exitCode=255
Apr 16 14:32:25.972343 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:25.971975 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" event={"ID":"56a754a7-5475-4e0b-923a-a226b2357194","Type":"ContainerDied","Data":"14a07c1bb7e5955dfd132415110041e2b032959c97687f34d91c354141fdf28c"}
Apr 16 14:32:25.972343 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:25.972306 2563 scope.go:117] "RemoveContainer" containerID="14a07c1bb7e5955dfd132415110041e2b032959c97687f34d91c354141fdf28c"
Apr 16 14:32:25.973236 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:25.973217 2563 generic.go:358] "Generic (PLEG): container finished" podID="68f9da2d-9f0d-4f14-af18-0d79355138fa" containerID="702fe788ee469104a0383f54f92eba990bf49e2b7cc6314f9e0690f9e089087f" exitCode=1
Apr 16 14:32:25.973328 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:25.973292 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" event={"ID":"68f9da2d-9f0d-4f14-af18-0d79355138fa","Type":"ContainerDied","Data":"702fe788ee469104a0383f54f92eba990bf49e2b7cc6314f9e0690f9e089087f"}
Apr 16 14:32:25.973724 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:25.973706 2563 scope.go:117] "RemoveContainer" containerID="702fe788ee469104a0383f54f92eba990bf49e2b7cc6314f9e0690f9e089087f"
Apr 16 14:32:26.977664 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:26.977626 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86586dbfd9-2cxqt" event={"ID":"56a754a7-5475-4e0b-923a-a226b2357194","Type":"ContainerStarted","Data":"bdc0388506bbf03fb19bc9659bccdbc36ee7ce1cb2f54c7b3a5b5f97a575a96f"}
Apr 16 14:32:26.979050 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:26.979027 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw" event={"ID":"68f9da2d-9f0d-4f14-af18-0d79355138fa","Type":"ContainerStarted","Data":"6d501c1e35aff80e8a62bcc5f3b4387869257333755ba1a5b4a8e7caaa594157"}
Apr 16 14:32:26.979304 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:26.979283 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:32:26.979866 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:26.979850 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5fb499bfb6-m5zzw"
Apr 16 14:32:31.728816 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.728737 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hqvsk"]
Apr 16 14:32:31.731835 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.731816 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.736790 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.736771 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:32:31.736912 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.736801 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6npkq\""
Apr 16 14:32:31.736912 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.736907 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:32:31.738271 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.737914 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:32:31.738271 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.738020 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:32:31.738461 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.738426 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:32:31.739140 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.739116 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:32:31.838869 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.838835 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-textfile\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839021 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.838892 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88255980-f2f9-48e3-847a-b8daedd7edc4-metrics-client-ca\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839021 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.838967 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-root\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839021 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.839010 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-sys\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.839029 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-tls\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.839065 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxzqm\" (UniqueName: \"kubernetes.io/projected/88255980-f2f9-48e3-847a-b8daedd7edc4-kube-api-access-dxzqm\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.839105 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-accelerators-collector-config\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.839137 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-wtmp\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.839172 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.839167 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940215 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940183 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-sys\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940215 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940218 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-tls\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940260 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxzqm\" (UniqueName: \"kubernetes.io/projected/88255980-f2f9-48e3-847a-b8daedd7edc4-kube-api-access-dxzqm\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940279 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-accelerators-collector-config\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940278 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-sys\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940298 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-wtmp\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940339 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940392 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-textfile\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940407 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-wtmp\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940736 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940454 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88255980-f2f9-48e3-847a-b8daedd7edc4-metrics-client-ca\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940736 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940472 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-root\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.940736 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.940555 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/88255980-f2f9-48e3-847a-b8daedd7edc4-root\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.941359 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.941340 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-textfile\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.941523 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.941504 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88255980-f2f9-48e3-847a-b8daedd7edc4-metrics-client-ca\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.941625 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.941600 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-accelerators-collector-config\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.942559 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.942541 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.942715 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.942694 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/88255980-f2f9-48e3-847a-b8daedd7edc4-node-exporter-tls\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:31.948710 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:31.948688 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxzqm\" (UniqueName: \"kubernetes.io/projected/88255980-f2f9-48e3-847a-b8daedd7edc4-kube-api-access-dxzqm\") pod \"node-exporter-hqvsk\" (UID: \"88255980-f2f9-48e3-847a-b8daedd7edc4\") " pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:32.043813 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.043745 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hqvsk"
Apr 16 14:32:32.052206 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:32:32.052169 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88255980_f2f9_48e3_847a_b8daedd7edc4.slice/crio-2394edefe351bfc7ddcbf2c315541f324aefba1e9c9a1a6ba179d44242b99dc1 WatchSource:0}: Error finding container 2394edefe351bfc7ddcbf2c315541f324aefba1e9c9a1a6ba179d44242b99dc1: Status 404 returned error can't find the container with id 2394edefe351bfc7ddcbf2c315541f324aefba1e9c9a1a6ba179d44242b99dc1
Apr 16 14:32:32.336553 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.336514 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:32:32.341486 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.341465 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zpkvj\""
Apr 16 14:32:32.347604 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.347586 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2c9qn"
Apr 16 14:32:32.465451 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.465420 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2c9qn"]
Apr 16 14:32:32.468902 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:32:32.468871 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04ae3980_8ad0_4077_87e4_d094e30cba62.slice/crio-2a3ea197816443d77d00f55f42e13172710ba29444ff57a6172483662f33c0c6 WatchSource:0}: Error finding container 2a3ea197816443d77d00f55f42e13172710ba29444ff57a6172483662f33c0c6: Status 404 returned error can't find the container with id 2a3ea197816443d77d00f55f42e13172710ba29444ff57a6172483662f33c0c6
Apr 16 14:32:32.996467 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.996432 2563 generic.go:358] "Generic (PLEG): container finished" podID="88255980-f2f9-48e3-847a-b8daedd7edc4" containerID="abbb879ddab8a3df977d4a0446c058db5b2252ea936090cc9c7bc49649487bda" exitCode=0
Apr 16 14:32:32.996912 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.996514 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hqvsk" event={"ID":"88255980-f2f9-48e3-847a-b8daedd7edc4","Type":"ContainerDied","Data":"abbb879ddab8a3df977d4a0446c058db5b2252ea936090cc9c7bc49649487bda"}
Apr 16 14:32:32.996912 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.996570 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hqvsk" event={"ID":"88255980-f2f9-48e3-847a-b8daedd7edc4","Type":"ContainerStarted","Data":"2394edefe351bfc7ddcbf2c315541f324aefba1e9c9a1a6ba179d44242b99dc1"}
Apr 16 14:32:32.997766 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:32.997742 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2c9qn" event={"ID":"04ae3980-8ad0-4077-87e4-d094e30cba62","Type":"ContainerStarted","Data":"2a3ea197816443d77d00f55f42e13172710ba29444ff57a6172483662f33c0c6"}
Apr 16 14:32:33.335564 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:33.335518 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w"
Apr 16 14:32:34.001414 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:34.001381 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hqvsk" event={"ID":"88255980-f2f9-48e3-847a-b8daedd7edc4","Type":"ContainerStarted","Data":"d74624adb8cfff285eea79a28f90d2627cfd735a9d512459bdedb830d79eef28"}
Apr 16 14:32:34.001414 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:34.001418 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hqvsk" event={"ID":"88255980-f2f9-48e3-847a-b8daedd7edc4","Type":"ContainerStarted","Data":"e1e54fa646be34dc33fd3fdaa80baa5413c6cf94e12f7b545ac0a772cbeadb53"}
Apr 16 14:32:34.022070 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:34.022027 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hqvsk" podStartSLOduration=2.303815914 podStartE2EDuration="3.022012974s" podCreationTimestamp="2026-04-16 14:32:31 +0000 UTC" firstStartedPulling="2026-04-16 14:32:32.05616402 +0000 UTC m=+170.283893626" lastFinishedPulling="2026-04-16 14:32:32.774361079 +0000 UTC m=+171.002090686" observedRunningTime="2026-04-16 14:32:34.020843236 +0000 UTC m=+172.248572865" watchObservedRunningTime="2026-04-16 14:32:34.022012974 +0000 UTC m=+172.249742602"
Apr 16 14:32:34.975349 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:34.975319 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w2ldg"
Apr 16 14:32:35.005412 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:35.005373 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2c9qn" event={"ID":"04ae3980-8ad0-4077-87e4-d094e30cba62","Type":"ContainerStarted","Data":"2a95158e7355ecbb2bcc9481022599afb3d5ad20e422722480c79f90f037d001"}
Apr 16 14:32:35.025043 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:35.024998 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2c9qn" podStartSLOduration=139.478662908 podStartE2EDuration="2m21.0249834s" podCreationTimestamp="2026-04-16 14:30:14 +0000 UTC" firstStartedPulling="2026-04-16 14:32:32.471037092 +0000 UTC m=+170.698766701" lastFinishedPulling="2026-04-16 14:32:34.017357587 +0000 UTC m=+172.245087193" observedRunningTime="2026-04-16 14:32:35.024228083 +0000 UTC m=+173.251957710" watchObservedRunningTime="2026-04-16 14:32:35.0249834 +0000 UTC m=+173.252713028"
Apr 16 14:32:37.351324 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:37.351293 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"]
Apr 16 14:32:37.355390 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:37.355360 2563 patch_prober.go:28] interesting pod/image-registry-5dbc4dd5d5-8cm2g container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:32:37.355549 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:37.355417 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" podUID="ff4a3789-cd2d-43ee-8413-947a344bcef3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:32:47.355488 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:32:47.355457 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:33:02.369687 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.369633 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" podUID="ff4a3789-cd2d-43ee-8413-947a344bcef3" containerName="registry" containerID="cri-o://74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359" gracePeriod=30
Apr 16 14:33:02.598514 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.598491 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"
Apr 16 14:33:02.649845 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.649783 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwqn\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-kube-api-access-nxwqn\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.649845 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.649816 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.650027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.649853 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-installation-pull-secrets\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.650027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.649975 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-image-registry-private-configuration\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.650027 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.650019 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3789-cd2d-43ee-8413-947a344bcef3-ca-trust-extracted\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.650192 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.650064 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-bound-sa-token\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.650192 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.650102 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-trusted-ca\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.650192 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.650145 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-certificates\") pod \"ff4a3789-cd2d-43ee-8413-947a344bcef3\" (UID: \"ff4a3789-cd2d-43ee-8413-947a344bcef3\") "
Apr 16 14:33:02.650759 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.650656 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:02.650927 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.650900 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:02.652353 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.652315 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-kube-api-access-nxwqn" (OuterVolumeSpecName: "kube-api-access-nxwqn") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "kube-api-access-nxwqn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:33:02.652481 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.652416 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "registry-tls".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:33:02.652481 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.652450 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:02.652606 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.652560 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:02.652785 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.652761 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:33:02.659104 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.659077 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff4a3789-cd2d-43ee-8413-947a344bcef3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ff4a3789-cd2d-43ee-8413-947a344bcef3" (UID: "ff4a3789-cd2d-43ee-8413-947a344bcef3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:02.750650 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750616 2563 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-installation-pull-secrets\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:33:02.750650 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750650 2563 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ff4a3789-cd2d-43ee-8413-947a344bcef3-image-registry-private-configuration\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:33:02.750650 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750662 2563 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ff4a3789-cd2d-43ee-8413-947a344bcef3-ca-trust-extracted\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:33:02.750849 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750671 2563 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-bound-sa-token\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:33:02.750849 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750680 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-trusted-ca\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:33:02.750849 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750688 2563 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-certificates\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath 
\"\"" Apr 16 14:33:02.750849 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750697 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxwqn\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-kube-api-access-nxwqn\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:33:02.750849 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:02.750708 2563 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ff4a3789-cd2d-43ee-8413-947a344bcef3-registry-tls\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:33:03.082756 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.082715 2563 generic.go:358] "Generic (PLEG): container finished" podID="ff4a3789-cd2d-43ee-8413-947a344bcef3" containerID="74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359" exitCode=0 Apr 16 14:33:03.082919 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.082772 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" Apr 16 14:33:03.082919 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.082793 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" event={"ID":"ff4a3789-cd2d-43ee-8413-947a344bcef3","Type":"ContainerDied","Data":"74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359"} Apr 16 14:33:03.082919 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.082831 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g" event={"ID":"ff4a3789-cd2d-43ee-8413-947a344bcef3","Type":"ContainerDied","Data":"42885cb5509392f8995374632b936f83ae90aa4f7b3e35537d5daecacfdd000d"} Apr 16 14:33:03.082919 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.082847 2563 scope.go:117] "RemoveContainer" containerID="74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359" Apr 16 14:33:03.090929 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.090914 2563 scope.go:117] "RemoveContainer" containerID="74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359" Apr 16 14:33:03.091176 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:33:03.091158 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359\": container with ID starting with 74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359 not found: ID does not exist" containerID="74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359" Apr 16 14:33:03.091224 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.091183 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359"} err="failed to get container status 
\"74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359\": rpc error: code = NotFound desc = could not find container \"74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359\": container with ID starting with 74c680285920facc0b68cfa1aef819634cb4a17ebf19e8d6fa866f4381718359 not found: ID does not exist" Apr 16 14:33:03.104910 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.104872 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"] Apr 16 14:33:03.109641 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:03.109619 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5dbc4dd5d5-8cm2g"] Apr 16 14:33:04.339088 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:04.339054 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff4a3789-cd2d-43ee-8413-947a344bcef3" path="/var/lib/kubelet/pods/ff4a3789-cd2d-43ee-8413-947a344bcef3/volumes" Apr 16 14:33:14.461188 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:14.461139 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" podUID="804957af-e065-45ad-a33e-6ce7f5097eb3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:33:24.461344 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:24.461294 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" podUID="804957af-e065-45ad-a33e-6ce7f5097eb3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:33:34.462130 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:34.462089 2563 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" 
podUID="804957af-e065-45ad-a33e-6ce7f5097eb3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:33:34.462504 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:34.462158 2563 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" Apr 16 14:33:34.462742 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:34.462721 2563 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"94b191e6a7b8f2119551a966016b02e0730bee8803754d699afd685942bbbdae"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 14:33:34.462798 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:34.462766 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" podUID="804957af-e065-45ad-a33e-6ce7f5097eb3" containerName="service-proxy" containerID="cri-o://94b191e6a7b8f2119551a966016b02e0730bee8803754d699afd685942bbbdae" gracePeriod=30 Apr 16 14:33:35.165350 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:35.165316 2563 generic.go:358] "Generic (PLEG): container finished" podID="804957af-e065-45ad-a33e-6ce7f5097eb3" containerID="94b191e6a7b8f2119551a966016b02e0730bee8803754d699afd685942bbbdae" exitCode=2 Apr 16 14:33:35.165350 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:35.165355 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" event={"ID":"804957af-e065-45ad-a33e-6ce7f5097eb3","Type":"ContainerDied","Data":"94b191e6a7b8f2119551a966016b02e0730bee8803754d699afd685942bbbdae"} Apr 16 14:33:35.165564 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:35.165378 2563 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d6df67564-h5v4t" event={"ID":"804957af-e065-45ad-a33e-6ce7f5097eb3","Type":"ContainerStarted","Data":"48212ba9c128c7711efce69cf1b54c7a64ce0b5f318b520bebf532df5e3d641b"} Apr 16 14:33:53.215224 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:53.215187 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:33:53.217476 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:53.217450 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e7706545-6db6-4426-919c-bf83b5020047-metrics-certs\") pod \"network-metrics-daemon-9fx7w\" (UID: \"e7706545-6db6-4426-919c-bf83b5020047\") " pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:33:53.438754 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:53.438722 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bl9dn\"" Apr 16 14:33:53.447073 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:53.447054 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fx7w" Apr 16 14:33:53.561513 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:53.561483 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9fx7w"] Apr 16 14:33:53.564346 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:33:53.564318 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7706545_6db6_4426_919c_bf83b5020047.slice/crio-68c85ee54c0328bbc5cb9a0e0b95087a596767f4d20f36cbcc1173891be0da84 WatchSource:0}: Error finding container 68c85ee54c0328bbc5cb9a0e0b95087a596767f4d20f36cbcc1173891be0da84: Status 404 returned error can't find the container with id 68c85ee54c0328bbc5cb9a0e0b95087a596767f4d20f36cbcc1173891be0da84 Apr 16 14:33:54.219592 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:54.219521 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9fx7w" event={"ID":"e7706545-6db6-4426-919c-bf83b5020047","Type":"ContainerStarted","Data":"68c85ee54c0328bbc5cb9a0e0b95087a596767f4d20f36cbcc1173891be0da84"} Apr 16 14:33:55.223335 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:55.223301 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9fx7w" event={"ID":"e7706545-6db6-4426-919c-bf83b5020047","Type":"ContainerStarted","Data":"56dafa3f546a5323d5fb3f1a72041a3bf47aecfc4ecbb15d1acc06ee6f4f84bb"} Apr 16 14:33:55.223335 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:55.223335 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9fx7w" event={"ID":"e7706545-6db6-4426-919c-bf83b5020047","Type":"ContainerStarted","Data":"53ad81e149523ea93e67b346600cdb8a6ec537429b9b93a0abe7f9bea3080da6"} Apr 16 14:33:55.250132 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:33:55.250021 2563 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-9fx7w" podStartSLOduration=252.279985818 podStartE2EDuration="4m13.250005712s" podCreationTimestamp="2026-04-16 14:29:42 +0000 UTC" firstStartedPulling="2026-04-16 14:33:53.566166459 +0000 UTC m=+251.793896065" lastFinishedPulling="2026-04-16 14:33:54.536186353 +0000 UTC m=+252.763915959" observedRunningTime="2026-04-16 14:33:55.24962767 +0000 UTC m=+253.477357297" watchObservedRunningTime="2026-04-16 14:33:55.250005712 +0000 UTC m=+253.477735341" Apr 16 14:34:42.219011 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:34:42.218979 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:34:42.219567 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:34:42.219184 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:34:42.223712 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:34:42.223688 2563 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:37:41.240880 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.240838 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nckt8"] Apr 16 14:37:41.241439 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.241141 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff4a3789-cd2d-43ee-8413-947a344bcef3" containerName="registry" Apr 16 14:37:41.241439 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.241160 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4a3789-cd2d-43ee-8413-947a344bcef3" containerName="registry" Apr 16 14:37:41.241439 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.241232 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff4a3789-cd2d-43ee-8413-947a344bcef3" containerName="registry" Apr 16 
14:37:41.243167 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.243144 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.244331 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.244306 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lczw\" (UniqueName: \"kubernetes.io/projected/9401953c-7def-49a1-8784-933713089d24-kube-api-access-5lczw\") pod \"cert-manager-cainjector-8966b78d4-nckt8\" (UID: \"9401953c-7def-49a1-8784-933713089d24\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.244432 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.244371 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9401953c-7def-49a1-8784-933713089d24-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nckt8\" (UID: \"9401953c-7def-49a1-8784-933713089d24\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.245657 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.245633 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 14:37:41.245657 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.245647 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-t7twx\"" Apr 16 14:37:41.246411 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.246392 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 14:37:41.252652 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.252627 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nckt8"] Apr 16 14:37:41.345081 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:37:41.345005 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lczw\" (UniqueName: \"kubernetes.io/projected/9401953c-7def-49a1-8784-933713089d24-kube-api-access-5lczw\") pod \"cert-manager-cainjector-8966b78d4-nckt8\" (UID: \"9401953c-7def-49a1-8784-933713089d24\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.345226 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.345144 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9401953c-7def-49a1-8784-933713089d24-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nckt8\" (UID: \"9401953c-7def-49a1-8784-933713089d24\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.353990 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.353961 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9401953c-7def-49a1-8784-933713089d24-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-nckt8\" (UID: \"9401953c-7def-49a1-8784-933713089d24\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.354140 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.354122 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lczw\" (UniqueName: \"kubernetes.io/projected/9401953c-7def-49a1-8784-933713089d24-kube-api-access-5lczw\") pod \"cert-manager-cainjector-8966b78d4-nckt8\" (UID: \"9401953c-7def-49a1-8784-933713089d24\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.551922 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.551838 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" Apr 16 14:37:41.666003 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.665968 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-nckt8"] Apr 16 14:37:41.669267 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:37:41.669238 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9401953c_7def_49a1_8784_933713089d24.slice/crio-bc529bbc08c57f1cbb571debc214e59d97112ab7ab36e762e30615b09be3e671 WatchSource:0}: Error finding container bc529bbc08c57f1cbb571debc214e59d97112ab7ab36e762e30615b09be3e671: Status 404 returned error can't find the container with id bc529bbc08c57f1cbb571debc214e59d97112ab7ab36e762e30615b09be3e671 Apr 16 14:37:41.671178 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.671159 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:37:41.783320 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:41.783282 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" event={"ID":"9401953c-7def-49a1-8784-933713089d24","Type":"ContainerStarted","Data":"bc529bbc08c57f1cbb571debc214e59d97112ab7ab36e762e30615b09be3e671"} Apr 16 14:37:45.795728 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:45.795688 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" event={"ID":"9401953c-7def-49a1-8784-933713089d24","Type":"ContainerStarted","Data":"089e21e2dfaa162269ebb81e51c8a5a5ebb3e656298b08f973b50fa8bfd9b2a7"} Apr 16 14:37:45.812456 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:45.812417 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-nckt8" podStartSLOduration=1.726579445 podStartE2EDuration="4.812402299s" 
podCreationTimestamp="2026-04-16 14:37:41 +0000 UTC" firstStartedPulling="2026-04-16 14:37:41.671283876 +0000 UTC m=+479.899013486" lastFinishedPulling="2026-04-16 14:37:44.757106732 +0000 UTC m=+482.984836340" observedRunningTime="2026-04-16 14:37:45.811805116 +0000 UTC m=+484.039534755" watchObservedRunningTime="2026-04-16 14:37:45.812402299 +0000 UTC m=+484.040131927" Apr 16 14:37:57.605745 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.605710 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-s8f75"] Apr 16 14:37:57.607944 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.607927 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-s8f75" Apr 16 14:37:57.610163 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.610143 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-2jk7q\"" Apr 16 14:37:57.616160 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.616128 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-s8f75"] Apr 16 14:37:57.655982 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.655950 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c10bdc11-47f4-4ae3-b9fb-6e9a98569385-bound-sa-token\") pod \"cert-manager-759f64656b-s8f75\" (UID: \"c10bdc11-47f4-4ae3-b9fb-6e9a98569385\") " pod="cert-manager/cert-manager-759f64656b-s8f75" Apr 16 14:37:57.656128 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.656005 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxlkl\" (UniqueName: \"kubernetes.io/projected/c10bdc11-47f4-4ae3-b9fb-6e9a98569385-kube-api-access-bxlkl\") pod \"cert-manager-759f64656b-s8f75\" (UID: \"c10bdc11-47f4-4ae3-b9fb-6e9a98569385\") " 
pod="cert-manager/cert-manager-759f64656b-s8f75" Apr 16 14:37:57.756922 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.756891 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c10bdc11-47f4-4ae3-b9fb-6e9a98569385-bound-sa-token\") pod \"cert-manager-759f64656b-s8f75\" (UID: \"c10bdc11-47f4-4ae3-b9fb-6e9a98569385\") " pod="cert-manager/cert-manager-759f64656b-s8f75" Apr 16 14:37:57.757061 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.756944 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxlkl\" (UniqueName: \"kubernetes.io/projected/c10bdc11-47f4-4ae3-b9fb-6e9a98569385-kube-api-access-bxlkl\") pod \"cert-manager-759f64656b-s8f75\" (UID: \"c10bdc11-47f4-4ae3-b9fb-6e9a98569385\") " pod="cert-manager/cert-manager-759f64656b-s8f75" Apr 16 14:37:57.766824 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.766794 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxlkl\" (UniqueName: \"kubernetes.io/projected/c10bdc11-47f4-4ae3-b9fb-6e9a98569385-kube-api-access-bxlkl\") pod \"cert-manager-759f64656b-s8f75\" (UID: \"c10bdc11-47f4-4ae3-b9fb-6e9a98569385\") " pod="cert-manager/cert-manager-759f64656b-s8f75" Apr 16 14:37:57.766933 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.766821 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c10bdc11-47f4-4ae3-b9fb-6e9a98569385-bound-sa-token\") pod \"cert-manager-759f64656b-s8f75\" (UID: \"c10bdc11-47f4-4ae3-b9fb-6e9a98569385\") " pod="cert-manager/cert-manager-759f64656b-s8f75" Apr 16 14:37:57.917421 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:57.917347 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-s8f75"
Apr 16 14:37:58.030653 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:58.030620 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-s8f75"]
Apr 16 14:37:58.033578 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:37:58.033517 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc10bdc11_47f4_4ae3_b9fb_6e9a98569385.slice/crio-55905d155e0632246365320e63939982acc27093728ee2296bc1500f3c6a6fd2 WatchSource:0}: Error finding container 55905d155e0632246365320e63939982acc27093728ee2296bc1500f3c6a6fd2: Status 404 returned error can't find the container with id 55905d155e0632246365320e63939982acc27093728ee2296bc1500f3c6a6fd2
Apr 16 14:37:58.833510 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:58.833477 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-s8f75" event={"ID":"c10bdc11-47f4-4ae3-b9fb-6e9a98569385","Type":"ContainerStarted","Data":"16544439bc9ba43d7b015d6696f1510170bb53d05fd34c4c91c6b925b4d3bbbc"}
Apr 16 14:37:58.833510 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:58.833513 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-s8f75" event={"ID":"c10bdc11-47f4-4ae3-b9fb-6e9a98569385","Type":"ContainerStarted","Data":"55905d155e0632246365320e63939982acc27093728ee2296bc1500f3c6a6fd2"}
Apr 16 14:37:58.851352 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:37:58.850885 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-s8f75" podStartSLOduration=1.850868412 podStartE2EDuration="1.850868412s" podCreationTimestamp="2026-04-16 14:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:37:58.850197742 +0000 UTC m=+497.077927375" watchObservedRunningTime="2026-04-16 14:37:58.850868412 +0000 UTC m=+497.078598041"
Apr 16 14:38:10.662818 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.662779 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"]
Apr 16 14:38:10.665867 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.665845 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.668158 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.668136 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 14:38:10.668289 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.668182 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 14:38:10.668410 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.668392 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 14:38:10.668487 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.668421 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-rhxrn\""
Apr 16 14:38:10.668555 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.668496 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 14:38:10.678412 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.678388 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"]
Apr 16 14:38:10.748316 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.748287 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/223bc827-900c-429e-a82b-27b6770f3dd4-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.748316 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.748328 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2rjr\" (UniqueName: \"kubernetes.io/projected/223bc827-900c-429e-a82b-27b6770f3dd4-kube-api-access-z2rjr\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.748565 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.748360 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/223bc827-900c-429e-a82b-27b6770f3dd4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.849336 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.849306 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/223bc827-900c-429e-a82b-27b6770f3dd4-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.849484 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.849343 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2rjr\" (UniqueName: \"kubernetes.io/projected/223bc827-900c-429e-a82b-27b6770f3dd4-kube-api-access-z2rjr\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.849484 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.849467 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/223bc827-900c-429e-a82b-27b6770f3dd4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.851722 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.851694 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/223bc827-900c-429e-a82b-27b6770f3dd4-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.851847 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.851790 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/223bc827-900c-429e-a82b-27b6770f3dd4-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.860653 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.860631 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2rjr\" (UniqueName: \"kubernetes.io/projected/223bc827-900c-429e-a82b-27b6770f3dd4-kube-api-access-z2rjr\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-466ff\" (UID: \"223bc827-900c-429e-a82b-27b6770f3dd4\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:10.976067 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:10.975997 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:11.109214 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:11.109186 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"]
Apr 16 14:38:11.111925 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:38:11.111898 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223bc827_900c_429e_a82b_27b6770f3dd4.slice/crio-318a1f510890c44b6c00ede9e0d1aacf43966285f739788eebd21ec986bfe179 WatchSource:0}: Error finding container 318a1f510890c44b6c00ede9e0d1aacf43966285f739788eebd21ec986bfe179: Status 404 returned error can't find the container with id 318a1f510890c44b6c00ede9e0d1aacf43966285f739788eebd21ec986bfe179
Apr 16 14:38:11.869729 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:11.869684 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff" event={"ID":"223bc827-900c-429e-a82b-27b6770f3dd4","Type":"ContainerStarted","Data":"318a1f510890c44b6c00ede9e0d1aacf43966285f739788eebd21ec986bfe179"}
Apr 16 14:38:13.875987 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:13.875897 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff" event={"ID":"223bc827-900c-429e-a82b-27b6770f3dd4","Type":"ContainerStarted","Data":"37167152a720c89b8169a4d178ed40e3ff838f70771d4f2416c7f3b6e080e3c0"}
Apr 16 14:38:13.876389 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:13.876042 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:13.904449 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:13.904395 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff" podStartSLOduration=1.434807382 podStartE2EDuration="3.904384925s" podCreationTimestamp="2026-04-16 14:38:10 +0000 UTC" firstStartedPulling="2026-04-16 14:38:11.113604243 +0000 UTC m=+509.341333850" lastFinishedPulling="2026-04-16 14:38:13.583181787 +0000 UTC m=+511.810911393" observedRunningTime="2026-04-16 14:38:13.90387109 +0000 UTC m=+512.131600718" watchObservedRunningTime="2026-04-16 14:38:13.904384925 +0000 UTC m=+512.132114552"
Apr 16 14:38:24.881116 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:24.881085 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-466ff"
Apr 16 14:38:29.403485 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.403453 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"]
Apr 16 14:38:29.410235 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.410210 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.412855 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.412831 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 16 14:38:29.413831 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.413804 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 14:38:29.414192 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.414038 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 16 14:38:29.414192 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.414075 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-5mw66\""
Apr 16 14:38:29.414370 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.414266 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 14:38:29.415233 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.415213 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"]
Apr 16 14:38:29.494996 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.494963 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtrr5\" (UniqueName: \"kubernetes.io/projected/449b2fdb-e79e-488a-8b89-992acf1c125f-kube-api-access-vtrr5\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.495133 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.495031 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/449b2fdb-e79e-488a-8b89-992acf1c125f-tls-certs\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.495133 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.495048 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/449b2fdb-e79e-488a-8b89-992acf1c125f-tmp\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.595443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.595410 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/449b2fdb-e79e-488a-8b89-992acf1c125f-tls-certs\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.595443 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.595443 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/449b2fdb-e79e-488a-8b89-992acf1c125f-tmp\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.595644 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.595474 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtrr5\" (UniqueName: \"kubernetes.io/projected/449b2fdb-e79e-488a-8b89-992acf1c125f-kube-api-access-vtrr5\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.597687 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.597669 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/449b2fdb-e79e-488a-8b89-992acf1c125f-tmp\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.597901 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.597885 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/449b2fdb-e79e-488a-8b89-992acf1c125f-tls-certs\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.605380 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.605339 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtrr5\" (UniqueName: \"kubernetes.io/projected/449b2fdb-e79e-488a-8b89-992acf1c125f-kube-api-access-vtrr5\") pod \"kube-auth-proxy-6468c4fc75-ncjnh\" (UID: \"449b2fdb-e79e-488a-8b89-992acf1c125f\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.720619 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.720540 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"
Apr 16 14:38:29.837427 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.837397 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh"]
Apr 16 14:38:29.840606 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:38:29.840578 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod449b2fdb_e79e_488a_8b89_992acf1c125f.slice/crio-de78d7921611e288e78d7bc15b6d94048869f3d3959f9d92f8b0bad02e8194f7 WatchSource:0}: Error finding container de78d7921611e288e78d7bc15b6d94048869f3d3959f9d92f8b0bad02e8194f7: Status 404 returned error can't find the container with id de78d7921611e288e78d7bc15b6d94048869f3d3959f9d92f8b0bad02e8194f7
Apr 16 14:38:29.920012 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:29.919980 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh" event={"ID":"449b2fdb-e79e-488a-8b89-992acf1c125f","Type":"ContainerStarted","Data":"de78d7921611e288e78d7bc15b6d94048869f3d3959f9d92f8b0bad02e8194f7"}
Apr 16 14:38:33.933168 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:33.933131 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh" event={"ID":"449b2fdb-e79e-488a-8b89-992acf1c125f","Type":"ContainerStarted","Data":"b6bbb05ec5cc065523d044cd0f4edf69a181987fc5d0ca7fb1dfa2761a8bd8c9"}
Apr 16 14:38:33.951890 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:33.951844 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-ncjnh" podStartSLOduration=1.742851325 podStartE2EDuration="4.951831061s" podCreationTimestamp="2026-04-16 14:38:29 +0000 UTC" firstStartedPulling="2026-04-16 14:38:29.842275982 +0000 UTC m=+528.070005588" lastFinishedPulling="2026-04-16 14:38:33.051255702 +0000 UTC m=+531.278985324" observedRunningTime="2026-04-16 14:38:33.950187915 +0000 UTC m=+532.177917543" watchObservedRunningTime="2026-04-16 14:38:33.951831061 +0000 UTC m=+532.179560728"
Apr 16 14:38:38.171126 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.171087 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-r6dd9"]
Apr 16 14:38:38.174437 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.174414 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:38.177060 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.177034 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 16 14:38:38.177166 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.177066 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-nsgls\""
Apr 16 14:38:38.193015 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.192984 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-r6dd9"]
Apr 16 14:38:38.263288 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.263260 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea12b07-d666-4d0a-97dd-20214f879591-cert\") pod \"kserve-controller-manager-856948b99f-r6dd9\" (UID: \"fea12b07-d666-4d0a-97dd-20214f879591\") " pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:38.263419 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.263311 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mp9q\" (UniqueName: \"kubernetes.io/projected/fea12b07-d666-4d0a-97dd-20214f879591-kube-api-access-7mp9q\") pod \"kserve-controller-manager-856948b99f-r6dd9\" (UID: \"fea12b07-d666-4d0a-97dd-20214f879591\") " pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:38.363900 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.363872 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea12b07-d666-4d0a-97dd-20214f879591-cert\") pod \"kserve-controller-manager-856948b99f-r6dd9\" (UID: \"fea12b07-d666-4d0a-97dd-20214f879591\") " pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:38.364079 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.363919 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mp9q\" (UniqueName: \"kubernetes.io/projected/fea12b07-d666-4d0a-97dd-20214f879591-kube-api-access-7mp9q\") pod \"kserve-controller-manager-856948b99f-r6dd9\" (UID: \"fea12b07-d666-4d0a-97dd-20214f879591\") " pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:38.364079 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:38:38.364030 2563 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 16 14:38:38.364199 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:38:38.364112 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fea12b07-d666-4d0a-97dd-20214f879591-cert podName:fea12b07-d666-4d0a-97dd-20214f879591 nodeName:}" failed. No retries permitted until 2026-04-16 14:38:38.864089396 +0000 UTC m=+537.091819002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fea12b07-d666-4d0a-97dd-20214f879591-cert") pod "kserve-controller-manager-856948b99f-r6dd9" (UID: "fea12b07-d666-4d0a-97dd-20214f879591") : secret "kserve-webhook-server-cert" not found
Apr 16 14:38:38.375992 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.375959 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mp9q\" (UniqueName: \"kubernetes.io/projected/fea12b07-d666-4d0a-97dd-20214f879591-kube-api-access-7mp9q\") pod \"kserve-controller-manager-856948b99f-r6dd9\" (UID: \"fea12b07-d666-4d0a-97dd-20214f879591\") " pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:38.868196 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.868163 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea12b07-d666-4d0a-97dd-20214f879591-cert\") pod \"kserve-controller-manager-856948b99f-r6dd9\" (UID: \"fea12b07-d666-4d0a-97dd-20214f879591\") " pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:38.870417 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:38.870388 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fea12b07-d666-4d0a-97dd-20214f879591-cert\") pod \"kserve-controller-manager-856948b99f-r6dd9\" (UID: \"fea12b07-d666-4d0a-97dd-20214f879591\") " pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:39.085659 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:39.085626 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:39.206600 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:39.206573 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-r6dd9"]
Apr 16 14:38:39.209149 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:38:39.209118 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfea12b07_d666_4d0a_97dd_20214f879591.slice/crio-8899c5d2a8158cec3184f632110d1156e3db51201242d55ce225cc559f81d3d6 WatchSource:0}: Error finding container 8899c5d2a8158cec3184f632110d1156e3db51201242d55ce225cc559f81d3d6: Status 404 returned error can't find the container with id 8899c5d2a8158cec3184f632110d1156e3db51201242d55ce225cc559f81d3d6
Apr 16 14:38:39.949650 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:39.949610 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9" event={"ID":"fea12b07-d666-4d0a-97dd-20214f879591","Type":"ContainerStarted","Data":"8899c5d2a8158cec3184f632110d1156e3db51201242d55ce225cc559f81d3d6"}
Apr 16 14:38:41.958237 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:41.958202 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9" event={"ID":"fea12b07-d666-4d0a-97dd-20214f879591","Type":"ContainerStarted","Data":"61e30905a67a510e0fdc0dd98b37f0d87462866dfb85c42dad0b6c3ba690ffb6"}
Apr 16 14:38:41.958588 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:41.958356 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9"
Apr 16 14:38:41.978202 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:41.978157 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9" podStartSLOduration=1.31363676 podStartE2EDuration="3.978143863s" podCreationTimestamp="2026-04-16 14:38:38 +0000 UTC" firstStartedPulling="2026-04-16 14:38:39.210439679 +0000 UTC m=+537.438169286" lastFinishedPulling="2026-04-16 14:38:41.874946783 +0000 UTC m=+540.102676389" observedRunningTime="2026-04-16 14:38:41.977174782 +0000 UTC m=+540.204904423" watchObservedRunningTime="2026-04-16 14:38:41.978143863 +0000 UTC m=+540.205873491"
Apr 16 14:38:47.122515 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.122480 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"]
Apr 16 14:38:47.130966 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.130946 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.135763 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.135739 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-gjlpt\""
Apr 16 14:38:47.135902 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.135768 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 16 14:38:47.136123 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.136103 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 16 14:38:47.140826 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.140803 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"]
Apr 16 14:38:47.234974 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.234941 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsds2\" (UniqueName: \"kubernetes.io/projected/642005e1-f5ff-488b-bf43-1fd18130e985-kube-api-access-dsds2\") pod \"servicemesh-operator3-55f49c5f94-v5s4k\" (UID: \"642005e1-f5ff-488b-bf43-1fd18130e985\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.235135 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.234983 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/642005e1-f5ff-488b-bf43-1fd18130e985-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v5s4k\" (UID: \"642005e1-f5ff-488b-bf43-1fd18130e985\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.335620 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.335581 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/642005e1-f5ff-488b-bf43-1fd18130e985-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v5s4k\" (UID: \"642005e1-f5ff-488b-bf43-1fd18130e985\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.335729 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.335678 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsds2\" (UniqueName: \"kubernetes.io/projected/642005e1-f5ff-488b-bf43-1fd18130e985-kube-api-access-dsds2\") pod \"servicemesh-operator3-55f49c5f94-v5s4k\" (UID: \"642005e1-f5ff-488b-bf43-1fd18130e985\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.338020 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.337990 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/642005e1-f5ff-488b-bf43-1fd18130e985-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v5s4k\" (UID: \"642005e1-f5ff-488b-bf43-1fd18130e985\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.344988 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.344964 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsds2\" (UniqueName: \"kubernetes.io/projected/642005e1-f5ff-488b-bf43-1fd18130e985-kube-api-access-dsds2\") pod \"servicemesh-operator3-55f49c5f94-v5s4k\" (UID: \"642005e1-f5ff-488b-bf43-1fd18130e985\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.440174 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.440091 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:47.556999 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.556966 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"]
Apr 16 14:38:47.560873 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:38:47.560845 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod642005e1_f5ff_488b_bf43_1fd18130e985.slice/crio-f748509ced6e6f826002df7df0bed763df79fd6b737f4e71762bbde32001084d WatchSource:0}: Error finding container f748509ced6e6f826002df7df0bed763df79fd6b737f4e71762bbde32001084d: Status 404 returned error can't find the container with id f748509ced6e6f826002df7df0bed763df79fd6b737f4e71762bbde32001084d
Apr 16 14:38:47.974261 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:47.974226 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k" event={"ID":"642005e1-f5ff-488b-bf43-1fd18130e985","Type":"ContainerStarted","Data":"f748509ced6e6f826002df7df0bed763df79fd6b737f4e71762bbde32001084d"}
Apr 16 14:38:49.984194 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:49.984153 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k" event={"ID":"642005e1-f5ff-488b-bf43-1fd18130e985","Type":"ContainerStarted","Data":"151037ebe3a465a1a5078fb01ca93db6f25af9f7dca62f5ee24f83d099252348"}
Apr 16 14:38:49.984873 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:49.984290 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k"
Apr 16 14:38:50.006874 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:50.006765 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k" podStartSLOduration=0.832342585 podStartE2EDuration="3.006752131s" podCreationTimestamp="2026-04-16 14:38:47 +0000 UTC" firstStartedPulling="2026-04-16 14:38:47.563208151 +0000 UTC m=+545.790937760" lastFinishedPulling="2026-04-16 14:38:49.737617697 +0000 UTC m=+547.965347306" observedRunningTime="2026-04-16 14:38:50.004835776 +0000 UTC m=+548.232565431" watchObservedRunningTime="2026-04-16 14:38:50.006752131 +0000 UTC m=+548.234481759"
Apr 16 14:38:56.928696 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.928662 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"]
Apr 16 14:38:56.932193 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.932174 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:56.934805 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.934778 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 14:38:56.934805 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.934801 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 14:38:56.934995 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.934818 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-pmmjk\""
Apr 16 14:38:56.934995 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.934820 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 14:38:56.934995 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.934801 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 14:38:56.942841 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:56.942820 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"]
Apr 16 14:38:57.005247 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.005214 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.005247 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.005250 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-976b9\" (UniqueName: \"kubernetes.io/projected/3e10e903-89e1-4380-b705-4f81f81ff239-kube-api-access-976b9\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.005446 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.005298 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3e10e903-89e1-4380-b705-4f81f81ff239-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.005446 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.005336 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.005446 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.005353 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.005446 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.005370 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/3e10e903-89e1-4380-b705-4f81f81ff239-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.005446 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.005402 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/3e10e903-89e1-4380-b705-4f81f81ff239-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.106666 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.106637 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.106666 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.106668 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-976b9\" (UniqueName: \"kubernetes.io/projected/3e10e903-89e1-4380-b705-4f81f81ff239-kube-api-access-976b9\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.106892 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.106694 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3e10e903-89e1-4380-b705-4f81f81ff239-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.106892 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.106729 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.106892 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.106755 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.106892 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.106782 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/3e10e903-89e1-4380-b705-4f81f81ff239-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.107108 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.107029 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/3e10e903-89e1-4380-b705-4f81f81ff239-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"
Apr 16 14:38:57.107436 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.107397 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/3e10e903-89e1-4380-b705-4f81f81ff239-istio-csr-ca-configmap\") pod
\"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.109129 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.109099 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/3e10e903-89e1-4380-b705-4f81f81ff239-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.109485 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.109455 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.109591 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.109498 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.109591 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.109564 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/3e10e903-89e1-4380-b705-4f81f81ff239-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.115151 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.115130 2563 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3e10e903-89e1-4380-b705-4f81f81ff239-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.115497 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.115480 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-976b9\" (UniqueName: \"kubernetes.io/projected/3e10e903-89e1-4380-b705-4f81f81ff239-kube-api-access-976b9\") pod \"istiod-openshift-gateway-55ff986f96-42nv4\" (UID: \"3e10e903-89e1-4380-b705-4f81f81ff239\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.242346 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.241872 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:38:57.368349 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:57.368318 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4"] Apr 16 14:38:57.371190 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:38:57.371158 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e10e903_89e1_4380_b705_4f81f81ff239.slice/crio-e7ed3153dab8b27ab79580216311bb8381f2f8eedcce38e5e80c3e968b1a6e0d WatchSource:0}: Error finding container e7ed3153dab8b27ab79580216311bb8381f2f8eedcce38e5e80c3e968b1a6e0d: Status 404 returned error can't find the container with id e7ed3153dab8b27ab79580216311bb8381f2f8eedcce38e5e80c3e968b1a6e0d Apr 16 14:38:58.008843 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:58.008808 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" 
event={"ID":"3e10e903-89e1-4380-b705-4f81f81ff239","Type":"ContainerStarted","Data":"e7ed3153dab8b27ab79580216311bb8381f2f8eedcce38e5e80c3e968b1a6e0d"} Apr 16 14:38:59.741074 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:59.741030 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 16 14:38:59.741345 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:38:59.741119 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 16 14:39:00.016600 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:00.016489 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" event={"ID":"3e10e903-89e1-4380-b705-4f81f81ff239","Type":"ContainerStarted","Data":"93c2dfaa13e60c6d2a58ad3fae2702ad4a09eaecc98fff06355958dfcc8684e5"} Apr 16 14:39:00.016748 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:00.016697 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:39:00.018412 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:00.018382 2563 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-42nv4 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 14:39:00.018562 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:00.018432 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" podUID="3e10e903-89e1-4380-b705-4f81f81ff239" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:39:00.040064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:00.040020 
2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" podStartSLOduration=1.672263371 podStartE2EDuration="4.040006148s" podCreationTimestamp="2026-04-16 14:38:56 +0000 UTC" firstStartedPulling="2026-04-16 14:38:57.373010025 +0000 UTC m=+555.600739631" lastFinishedPulling="2026-04-16 14:38:59.740752788 +0000 UTC m=+557.968482408" observedRunningTime="2026-04-16 14:39:00.038571217 +0000 UTC m=+558.266300836" watchObservedRunningTime="2026-04-16 14:39:00.040006148 +0000 UTC m=+558.267735778" Apr 16 14:39:00.989490 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:00.989458 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v5s4k" Apr 16 14:39:01.020679 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:01.020638 2563 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-42nv4 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 14:39:01.020833 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:01.020697 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" podUID="3e10e903-89e1-4380-b705-4f81f81ff239" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:39:04.021148 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:04.021113 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-42nv4" Apr 16 14:39:12.966105 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:12.966076 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-r6dd9" Apr 16 14:39:42.238546 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:42.238512 2563 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:39:42.239052 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:39:42.238972 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:40:06.901235 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:06.901157 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-txt6f"] Apr 16 14:40:06.904133 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:06.904117 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-txt6f" Apr 16 14:40:06.906719 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:06.906697 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:40:06.906818 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:06.906775 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-2b65n\"" Apr 16 14:40:06.907587 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:06.907565 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:40:06.915544 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:06.915510 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-txt6f"] Apr 16 14:40:07.043176 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:07.043137 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp59w\" (UniqueName: \"kubernetes.io/projected/542f11da-42dc-40c9-b9a5-5fb456f554b9-kube-api-access-jp59w\") pod \"authorino-operator-657f44b778-txt6f\" 
(UID: \"542f11da-42dc-40c9-b9a5-5fb456f554b9\") " pod="kuadrant-system/authorino-operator-657f44b778-txt6f" Apr 16 14:40:07.143497 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:07.143461 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp59w\" (UniqueName: \"kubernetes.io/projected/542f11da-42dc-40c9-b9a5-5fb456f554b9-kube-api-access-jp59w\") pod \"authorino-operator-657f44b778-txt6f\" (UID: \"542f11da-42dc-40c9-b9a5-5fb456f554b9\") " pod="kuadrant-system/authorino-operator-657f44b778-txt6f" Apr 16 14:40:07.158038 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:07.157972 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp59w\" (UniqueName: \"kubernetes.io/projected/542f11da-42dc-40c9-b9a5-5fb456f554b9-kube-api-access-jp59w\") pod \"authorino-operator-657f44b778-txt6f\" (UID: \"542f11da-42dc-40c9-b9a5-5fb456f554b9\") " pod="kuadrant-system/authorino-operator-657f44b778-txt6f" Apr 16 14:40:07.214600 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:07.214571 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-txt6f" Apr 16 14:40:07.338912 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:07.338876 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-txt6f"] Apr 16 14:40:07.341790 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:40:07.341759 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542f11da_42dc_40c9_b9a5_5fb456f554b9.slice/crio-01f23e3246ff4cfcaeb019e0001da4d96f3d39a6fc48685c75760d63e8e50229 WatchSource:0}: Error finding container 01f23e3246ff4cfcaeb019e0001da4d96f3d39a6fc48685c75760d63e8e50229: Status 404 returned error can't find the container with id 01f23e3246ff4cfcaeb019e0001da4d96f3d39a6fc48685c75760d63e8e50229 Apr 16 14:40:08.224327 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:08.224294 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-txt6f" event={"ID":"542f11da-42dc-40c9-b9a5-5fb456f554b9","Type":"ContainerStarted","Data":"01f23e3246ff4cfcaeb019e0001da4d96f3d39a6fc48685c75760d63e8e50229"} Apr 16 14:40:09.228717 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:09.228680 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-txt6f" event={"ID":"542f11da-42dc-40c9-b9a5-5fb456f554b9","Type":"ContainerStarted","Data":"12bbb5b2ead6b17a8e6b9ed673b27d489e305fad05ee4b2af2dbb07820e9f0ee"} Apr 16 14:40:09.229148 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:09.228796 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-txt6f" Apr 16 14:40:09.246962 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:09.246909 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-txt6f" podStartSLOduration=1.567373576 
podStartE2EDuration="3.24688516s" podCreationTimestamp="2026-04-16 14:40:06 +0000 UTC" firstStartedPulling="2026-04-16 14:40:07.343743245 +0000 UTC m=+625.571472851" lastFinishedPulling="2026-04-16 14:40:09.023254825 +0000 UTC m=+627.250984435" observedRunningTime="2026-04-16 14:40:09.245819439 +0000 UTC m=+627.473549095" watchObservedRunningTime="2026-04-16 14:40:09.24688516 +0000 UTC m=+627.474614791" Apr 16 14:40:10.348070 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.348042 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m"] Apr 16 14:40:10.351209 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.351189 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" Apr 16 14:40:10.353725 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.353705 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-l6ks2\"" Apr 16 14:40:10.353813 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.353705 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 14:40:10.364064 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.364043 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m"] Apr 16 14:40:10.370778 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.370751 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7vm\" (UniqueName: \"kubernetes.io/projected/5d62a358-c857-4dad-ba5d-ae3696fe87cd-kube-api-access-hh7vm\") pod \"dns-operator-controller-manager-648d5c98bc-gv82m\" (UID: \"5d62a358-c857-4dad-ba5d-ae3696fe87cd\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" Apr 16 
14:40:10.472100 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.472064 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7vm\" (UniqueName: \"kubernetes.io/projected/5d62a358-c857-4dad-ba5d-ae3696fe87cd-kube-api-access-hh7vm\") pod \"dns-operator-controller-manager-648d5c98bc-gv82m\" (UID: \"5d62a358-c857-4dad-ba5d-ae3696fe87cd\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" Apr 16 14:40:10.480991 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.480964 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7vm\" (UniqueName: \"kubernetes.io/projected/5d62a358-c857-4dad-ba5d-ae3696fe87cd-kube-api-access-hh7vm\") pod \"dns-operator-controller-manager-648d5c98bc-gv82m\" (UID: \"5d62a358-c857-4dad-ba5d-ae3696fe87cd\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" Apr 16 14:40:10.661024 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.660948 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" Apr 16 14:40:10.778844 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:10.778820 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m"] Apr 16 14:40:10.781801 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:40:10.781777 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d62a358_c857_4dad_ba5d_ae3696fe87cd.slice/crio-ca7f9997fe84d9d0d25d222bf9bfe094747aa1279e26a02873e3079754530f6a WatchSource:0}: Error finding container ca7f9997fe84d9d0d25d222bf9bfe094747aa1279e26a02873e3079754530f6a: Status 404 returned error can't find the container with id ca7f9997fe84d9d0d25d222bf9bfe094747aa1279e26a02873e3079754530f6a Apr 16 14:40:11.235037 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:11.235003 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" event={"ID":"5d62a358-c857-4dad-ba5d-ae3696fe87cd","Type":"ContainerStarted","Data":"ca7f9997fe84d9d0d25d222bf9bfe094747aa1279e26a02873e3079754530f6a"} Apr 16 14:40:14.246760 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:14.246680 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" event={"ID":"5d62a358-c857-4dad-ba5d-ae3696fe87cd","Type":"ContainerStarted","Data":"3f13b43e50e1e768f66c9c5847b860349d8e9a109a3a4d9d82d9ae488e8b60f0"} Apr 16 14:40:14.247205 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:14.246826 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" Apr 16 14:40:14.264890 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:14.264846 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" podStartSLOduration=1.2142648600000001 podStartE2EDuration="4.26483061s" podCreationTimestamp="2026-04-16 14:40:10 +0000 UTC" firstStartedPulling="2026-04-16 14:40:10.783751327 +0000 UTC m=+629.011480933" lastFinishedPulling="2026-04-16 14:40:13.834317073 +0000 UTC m=+632.062046683" observedRunningTime="2026-04-16 14:40:14.263495417 +0000 UTC m=+632.491225045" watchObservedRunningTime="2026-04-16 14:40:14.26483061 +0000 UTC m=+632.492560237" Apr 16 14:40:20.233566 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:20.233514 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-txt6f" Apr 16 14:40:21.440021 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.439986 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq"] Apr 16 14:40:21.443152 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.443135 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.445522 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.445501 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7cjrp\"" Apr 16 14:40:21.452979 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.452949 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc86f\" (UniqueName: \"kubernetes.io/projected/7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09-kube-api-access-bc86f\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4xldq\" (UID: \"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.453090 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.453073 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4xldq\" (UID: \"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.456197 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.456174 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq"] Apr 16 14:40:21.553572 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.553513 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc86f\" (UniqueName: \"kubernetes.io/projected/7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09-kube-api-access-bc86f\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4xldq\" (UID: \"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.553722 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.553610 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4xldq\" (UID: \"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.553942 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.553924 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4xldq\" (UID: \"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.564506 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.564477 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc86f\" (UniqueName: \"kubernetes.io/projected/7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09-kube-api-access-bc86f\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-4xldq\" (UID: \"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.760392 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.760297 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:21.879977 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:21.879931 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq"] Apr 16 14:40:21.883122 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:40:21.883087 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea3f6c0_bb7c_4e11_98ad_aacae5ed5b09.slice/crio-a58d8026cc4965f3acc8922117e60cacd170d9a412d7594b30ef01a0d9198461 WatchSource:0}: Error finding container a58d8026cc4965f3acc8922117e60cacd170d9a412d7594b30ef01a0d9198461: Status 404 returned error can't find the container with id a58d8026cc4965f3acc8922117e60cacd170d9a412d7594b30ef01a0d9198461 Apr 16 14:40:22.271895 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:22.271860 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" event={"ID":"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09","Type":"ContainerStarted","Data":"a58d8026cc4965f3acc8922117e60cacd170d9a412d7594b30ef01a0d9198461"} Apr 16 14:40:25.253465 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:25.253428 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gv82m" Apr 16 14:40:26.287932 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:26.287847 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" event={"ID":"7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09","Type":"ContainerStarted","Data":"493b20cc9825ad012eb95ee8f11d2d938a3f3c28084f8648a4480563755f1e0a"} Apr 16 14:40:26.288293 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:26.287949 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:40:26.313162 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:26.313115 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" podStartSLOduration=1.358295534 podStartE2EDuration="5.313103698s" podCreationTimestamp="2026-04-16 14:40:21 +0000 UTC" firstStartedPulling="2026-04-16 14:40:21.885373147 +0000 UTC m=+640.113102757" lastFinishedPulling="2026-04-16 14:40:25.840181303 +0000 UTC m=+644.067910921" observedRunningTime="2026-04-16 14:40:26.310502729 +0000 UTC m=+644.538232357" watchObservedRunningTime="2026-04-16 14:40:26.313103698 +0000 UTC m=+644.540833326" Apr 16 14:40:37.292964 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:40:37.292933 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-4xldq" Apr 16 14:41:36.171197 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.171115 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fltkt"] Apr 16 14:41:36.173578 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.173559 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:36.175947 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.175925 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-tjcpj\"" Apr 16 14:41:36.184321 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.184296 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fltkt"] Apr 16 14:41:36.316227 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.316197 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-55b68ffcbd-4d9xf"] Apr 16 14:41:36.318490 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.318468 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" Apr 16 14:41:36.325585 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.325558 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5g25\" (UniqueName: \"kubernetes.io/projected/95e65639-e44a-4940-8a60-019d9e1fef84-kube-api-access-w5g25\") pod \"maas-controller-6d4c8f55f9-fltkt\" (UID: \"95e65639-e44a-4940-8a60-019d9e1fef84\") " pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:36.327072 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.326937 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-55b68ffcbd-4d9xf"] Apr 16 14:41:36.426429 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.426351 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhjzz\" (UniqueName: \"kubernetes.io/projected/cfc7ff7b-a43f-4622-98d5-0283dd09145e-kube-api-access-rhjzz\") pod \"maas-controller-55b68ffcbd-4d9xf\" (UID: \"cfc7ff7b-a43f-4622-98d5-0283dd09145e\") " pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" Apr 16 14:41:36.426603 
ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.426433 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5g25\" (UniqueName: \"kubernetes.io/projected/95e65639-e44a-4940-8a60-019d9e1fef84-kube-api-access-w5g25\") pod \"maas-controller-6d4c8f55f9-fltkt\" (UID: \"95e65639-e44a-4940-8a60-019d9e1fef84\") " pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:36.429168 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.429137 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-55b68ffcbd-4d9xf"] Apr 16 14:41:36.429373 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:41:36.429353 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rhjzz], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" podUID="cfc7ff7b-a43f-4622-98d5-0283dd09145e" Apr 16 14:41:36.434978 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.434954 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5g25\" (UniqueName: \"kubernetes.io/projected/95e65639-e44a-4940-8a60-019d9e1fef84-kube-api-access-w5g25\") pod \"maas-controller-6d4c8f55f9-fltkt\" (UID: \"95e65639-e44a-4940-8a60-019d9e1fef84\") " pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:36.484816 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.484785 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:36.516431 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.516393 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" Apr 16 14:41:36.521173 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.521152 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" Apr 16 14:41:36.526822 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.526801 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhjzz\" (UniqueName: \"kubernetes.io/projected/cfc7ff7b-a43f-4622-98d5-0283dd09145e-kube-api-access-rhjzz\") pod \"maas-controller-55b68ffcbd-4d9xf\" (UID: \"cfc7ff7b-a43f-4622-98d5-0283dd09145e\") " pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" Apr 16 14:41:36.536617 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.536590 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhjzz\" (UniqueName: \"kubernetes.io/projected/cfc7ff7b-a43f-4622-98d5-0283dd09145e-kube-api-access-rhjzz\") pod \"maas-controller-55b68ffcbd-4d9xf\" (UID: \"cfc7ff7b-a43f-4622-98d5-0283dd09145e\") " pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" Apr 16 14:41:36.606324 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.606294 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fltkt"] Apr 16 14:41:36.608960 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:41:36.608933 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e65639_e44a_4940_8a60_019d9e1fef84.slice/crio-914622114e8e0d07d3cd0baea683166d0c24f8349fd26bed7c15a6b4e02f0f4e WatchSource:0}: Error finding container 914622114e8e0d07d3cd0baea683166d0c24f8349fd26bed7c15a6b4e02f0f4e: Status 404 returned error can't find the container with id 914622114e8e0d07d3cd0baea683166d0c24f8349fd26bed7c15a6b4e02f0f4e Apr 16 14:41:36.628043 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.628020 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhjzz\" (UniqueName: \"kubernetes.io/projected/cfc7ff7b-a43f-4622-98d5-0283dd09145e-kube-api-access-rhjzz\") pod 
\"cfc7ff7b-a43f-4622-98d5-0283dd09145e\" (UID: \"cfc7ff7b-a43f-4622-98d5-0283dd09145e\") " Apr 16 14:41:36.629950 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.629928 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc7ff7b-a43f-4622-98d5-0283dd09145e-kube-api-access-rhjzz" (OuterVolumeSpecName: "kube-api-access-rhjzz") pod "cfc7ff7b-a43f-4622-98d5-0283dd09145e" (UID: "cfc7ff7b-a43f-4622-98d5-0283dd09145e"). InnerVolumeSpecName "kube-api-access-rhjzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:41:36.729059 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:36.728974 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rhjzz\" (UniqueName: \"kubernetes.io/projected/cfc7ff7b-a43f-4622-98d5-0283dd09145e-kube-api-access-rhjzz\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:41:37.522205 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:37.522174 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-55b68ffcbd-4d9xf" Apr 16 14:41:37.522205 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:37.522177 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" event={"ID":"95e65639-e44a-4940-8a60-019d9e1fef84","Type":"ContainerStarted","Data":"914622114e8e0d07d3cd0baea683166d0c24f8349fd26bed7c15a6b4e02f0f4e"} Apr 16 14:41:37.559742 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:37.559693 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-55b68ffcbd-4d9xf"] Apr 16 14:41:37.564017 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:37.563990 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-55b68ffcbd-4d9xf"] Apr 16 14:41:38.340128 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:38.339914 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc7ff7b-a43f-4622-98d5-0283dd09145e" path="/var/lib/kubelet/pods/cfc7ff7b-a43f-4622-98d5-0283dd09145e/volumes" Apr 16 14:41:39.530535 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:39.530498 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" event={"ID":"95e65639-e44a-4940-8a60-019d9e1fef84","Type":"ContainerStarted","Data":"0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668"} Apr 16 14:41:39.530929 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:39.530602 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:39.550749 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:39.550696 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" podStartSLOduration=1.482060536 podStartE2EDuration="3.550678412s" podCreationTimestamp="2026-04-16 14:41:36 +0000 UTC" firstStartedPulling="2026-04-16 14:41:36.6101405 +0000 UTC 
m=+714.837870109" lastFinishedPulling="2026-04-16 14:41:38.678758379 +0000 UTC m=+716.906487985" observedRunningTime="2026-04-16 14:41:39.550285812 +0000 UTC m=+717.778015440" watchObservedRunningTime="2026-04-16 14:41:39.550678412 +0000 UTC m=+717.778408018" Apr 16 14:41:50.540397 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:50.540365 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:51.459570 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:51.459522 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fltkt"] Apr 16 14:41:51.459784 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:51.459762 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" podUID="95e65639-e44a-4940-8a60-019d9e1fef84" containerName="manager" containerID="cri-o://0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668" gracePeriod=10 Apr 16 14:41:51.693094 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:51.693072 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:51.847709 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:51.847679 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5g25\" (UniqueName: \"kubernetes.io/projected/95e65639-e44a-4940-8a60-019d9e1fef84-kube-api-access-w5g25\") pod \"95e65639-e44a-4940-8a60-019d9e1fef84\" (UID: \"95e65639-e44a-4940-8a60-019d9e1fef84\") " Apr 16 14:41:51.849868 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:51.849830 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e65639-e44a-4940-8a60-019d9e1fef84-kube-api-access-w5g25" (OuterVolumeSpecName: "kube-api-access-w5g25") pod "95e65639-e44a-4940-8a60-019d9e1fef84" (UID: "95e65639-e44a-4940-8a60-019d9e1fef84"). InnerVolumeSpecName "kube-api-access-w5g25". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:41:51.949099 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:51.949062 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5g25\" (UniqueName: \"kubernetes.io/projected/95e65639-e44a-4940-8a60-019d9e1fef84-kube-api-access-w5g25\") on node \"ip-10-0-141-239.ec2.internal\" DevicePath \"\"" Apr 16 14:41:52.575066 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.575032 2563 generic.go:358] "Generic (PLEG): container finished" podID="95e65639-e44a-4940-8a60-019d9e1fef84" containerID="0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668" exitCode=0 Apr 16 14:41:52.575231 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.575095 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" Apr 16 14:41:52.575231 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.575099 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" event={"ID":"95e65639-e44a-4940-8a60-019d9e1fef84","Type":"ContainerDied","Data":"0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668"} Apr 16 14:41:52.575231 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.575192 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fltkt" event={"ID":"95e65639-e44a-4940-8a60-019d9e1fef84","Type":"ContainerDied","Data":"914622114e8e0d07d3cd0baea683166d0c24f8349fd26bed7c15a6b4e02f0f4e"} Apr 16 14:41:52.575231 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.575206 2563 scope.go:117] "RemoveContainer" containerID="0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668" Apr 16 14:41:52.582662 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.582644 2563 scope.go:117] "RemoveContainer" containerID="0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668" Apr 16 14:41:52.582908 ip-10-0-141-239 kubenswrapper[2563]: E0416 14:41:52.582878 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668\": container with ID starting with 0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668 not found: ID does not exist" containerID="0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668" Apr 16 14:41:52.582957 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.582919 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668"} err="failed to get container status \"0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668\": rpc error: 
code = NotFound desc = could not find container \"0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668\": container with ID starting with 0370f5e5a538a66768fa51a9ddd4ac59f24d4011b40d52c8835daedf1f1f2668 not found: ID does not exist" Apr 16 14:41:52.594026 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.593996 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fltkt"] Apr 16 14:41:52.598491 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:52.598432 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fltkt"] Apr 16 14:41:54.339291 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:41:54.339258 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e65639-e44a-4940-8a60-019d9e1fef84" path="/var/lib/kubelet/pods/95e65639-e44a-4940-8a60-019d9e1fef84/volumes" Apr 16 14:44:42.258289 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:44:42.258203 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:44:42.259727 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:44:42.259702 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:49:42.278795 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:49:42.278758 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:49:42.281304 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:49:42.279983 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:52:15.184882 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:15.184801 2563 log.go:25] "Finished 
parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-r6dd9_fea12b07-d666-4d0a-97dd-20214f879591/manager/0.log" Apr 16 14:52:15.520332 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:15.520253 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6f7bb56bb6-466ff_223bc827-900c-429e-a82b-27b6770f3dd4/manager/0.log" Apr 16 14:52:17.273884 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:17.273849 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-txt6f_542f11da-42dc-40c9-b9a5-5fb456f554b9/manager/0.log" Apr 16 14:52:17.382544 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:17.382504 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-gv82m_5d62a358-c857-4dad-ba5d-ae3696fe87cd/manager/0.log" Apr 16 14:52:17.731065 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:17.731035 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-4xldq_7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09/manager/0.log" Apr 16 14:52:18.422876 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:18.422843 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-42nv4_3e10e903-89e1-4380-b705-4f81f81ff239/discovery/0.log" Apr 16 14:52:18.533171 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:18.533142 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6468c4fc75-ncjnh_449b2fdb-e79e-488a-8b89-992acf1c125f/kube-auth-proxy/0.log" Apr 16 14:52:25.871019 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:25.870988 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x5q8k_fd31aa30-3e27-4c57-ae7d-843fa27b25d3/global-pull-secret-syncer/0.log" Apr 16 14:52:25.981815 
ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:25.981780 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2cqcm_d2ce07ef-ba7f-4f81-a82f-80f139286fa6/konnectivity-agent/0.log" Apr 16 14:52:26.116370 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:26.116339 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-239.ec2.internal_93794f44aea94deab445c9557d98f20e/haproxy/0.log" Apr 16 14:52:30.356344 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:30.356310 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-txt6f_542f11da-42dc-40c9-b9a5-5fb456f554b9/manager/0.log" Apr 16 14:52:30.383736 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:30.383705 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-gv82m_5d62a358-c857-4dad-ba5d-ae3696fe87cd/manager/0.log" Apr 16 14:52:30.503039 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:30.503007 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-4xldq_7ea3f6c0-bb7c-4e11-98ad-aacae5ed5b09/manager/0.log" Apr 16 14:52:32.382166 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:32.382087 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hqvsk_88255980-f2f9-48e3-847a-b8daedd7edc4/node-exporter/0.log" Apr 16 14:52:32.405293 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:32.405268 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hqvsk_88255980-f2f9-48e3-847a-b8daedd7edc4/kube-rbac-proxy/0.log" Apr 16 14:52:32.430004 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:32.429981 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hqvsk_88255980-f2f9-48e3-847a-b8daedd7edc4/init-textfile/0.log" 
Apr 16 14:52:34.612650 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.612616 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb"] Apr 16 14:52:34.613050 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.612877 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95e65639-e44a-4940-8a60-019d9e1fef84" containerName="manager" Apr 16 14:52:34.613050 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.612889 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="95e65639-e44a-4940-8a60-019d9e1fef84" containerName="manager" Apr 16 14:52:34.613050 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.612951 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="95e65639-e44a-4940-8a60-019d9e1fef84" containerName="manager" Apr 16 14:52:34.615950 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.615929 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.618150 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.618127 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mcxmz\"/\"default-dockercfg-d2ggf\"" Apr 16 14:52:34.618271 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.618151 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mcxmz\"/\"openshift-service-ca.crt\"" Apr 16 14:52:34.619042 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.619026 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mcxmz\"/\"kube-root-ca.crt\"" Apr 16 14:52:34.626452 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.626427 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb"] Apr 16 14:52:34.759740 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:52:34.759697 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-podres\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.759740 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.759739 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-sys\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.759969 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.759759 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rvp\" (UniqueName: \"kubernetes.io/projected/b06603d7-d73b-4458-b2ab-5dff2928caa3-kube-api-access-c4rvp\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.759969 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.759857 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-proc\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.759969 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.759927 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-lib-modules\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861313 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861252 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-lib-modules\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861334 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-podres\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861352 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-sys\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rvp\" (UniqueName: \"kubernetes.io/projected/b06603d7-d73b-4458-b2ab-5dff2928caa3-kube-api-access-c4rvp\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 
kubenswrapper[2563]: I0416 14:52:34.861395 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-proc\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861476 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-proc\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861480 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-lib-modules\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861476 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-sys\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.861548 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.861488 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b06603d7-d73b-4458-b2ab-5dff2928caa3-podres\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " 
pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.870208 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.870147 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rvp\" (UniqueName: \"kubernetes.io/projected/b06603d7-d73b-4458-b2ab-5dff2928caa3-kube-api-access-c4rvp\") pod \"perf-node-gather-daemonset-pbbhb\" (UID: \"b06603d7-d73b-4458-b2ab-5dff2928caa3\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:34.927918 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:34.927881 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:35.048356 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:35.048326 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb"] Apr 16 14:52:35.052049 ip-10-0-141-239 kubenswrapper[2563]: W0416 14:52:35.052011 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb06603d7_d73b_4458_b2ab_5dff2928caa3.slice/crio-f8b57c20d5bc901f3c33ac0b7999f9d65ee9f88ac7a343d6a8c2ac1540f38642 WatchSource:0}: Error finding container f8b57c20d5bc901f3c33ac0b7999f9d65ee9f88ac7a343d6a8c2ac1540f38642: Status 404 returned error can't find the container with id f8b57c20d5bc901f3c33ac0b7999f9d65ee9f88ac7a343d6a8c2ac1540f38642 Apr 16 14:52:35.053784 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:35.053762 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:35.613221 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:35.613186 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" event={"ID":"b06603d7-d73b-4458-b2ab-5dff2928caa3","Type":"ContainerStarted","Data":"996723c7c2556c53714be1aa448637729e60e2c550b53735c21e5b0c6146c12b"} 
Apr 16 14:52:35.613221 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:35.613222 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" event={"ID":"b06603d7-d73b-4458-b2ab-5dff2928caa3","Type":"ContainerStarted","Data":"f8b57c20d5bc901f3c33ac0b7999f9d65ee9f88ac7a343d6a8c2ac1540f38642"} Apr 16 14:52:35.613657 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:35.613338 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:35.629586 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:35.629505 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" podStartSLOduration=1.62948878 podStartE2EDuration="1.62948878s" podCreationTimestamp="2026-04-16 14:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:35.628330591 +0000 UTC m=+1373.856060219" watchObservedRunningTime="2026-04-16 14:52:35.62948878 +0000 UTC m=+1373.857218409" Apr 16 14:52:36.905119 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:36.905087 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w2ldg_6355259e-a870-4945-9b77-524fb13888c6/dns/0.log" Apr 16 14:52:36.928824 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:36.928793 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w2ldg_6355259e-a870-4945-9b77-524fb13888c6/kube-rbac-proxy/0.log" Apr 16 14:52:36.955795 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:36.955774 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ktxmg_8c240568-6c79-4c9e-af56-ea680b0f0410/dns-node-resolver/0.log" Apr 16 14:52:37.553005 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:37.552974 2563 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fvd5j_898da78a-88a7-4608-baa8-e5a6bdba777f/node-ca/0.log" Apr 16 14:52:38.499738 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:38.499709 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-42nv4_3e10e903-89e1-4380-b705-4f81f81ff239/discovery/0.log" Apr 16 14:52:38.521762 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:38.521733 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6468c4fc75-ncjnh_449b2fdb-e79e-488a-8b89-992acf1c125f/kube-auth-proxy/0.log" Apr 16 14:52:39.170751 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:39.170719 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2c9qn_04ae3980-8ad0-4077-87e4-d094e30cba62/serve-healthcheck-canary/0.log" Apr 16 14:52:39.721748 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:39.721720 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-45bjl_e3c4378b-8642-4e54-b206-e56cbc51fa51/kube-rbac-proxy/0.log" Apr 16 14:52:39.744267 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:39.744241 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-45bjl_e3c4378b-8642-4e54-b206-e56cbc51fa51/exporter/0.log" Apr 16 14:52:39.768370 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:39.768345 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-45bjl_e3c4378b-8642-4e54-b206-e56cbc51fa51/extractor/0.log" Apr 16 14:52:41.624840 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:41.624812 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-pbbhb" Apr 16 14:52:41.839466 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:41.839438 2563 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-r6dd9_fea12b07-d666-4d0a-97dd-20214f879591/manager/0.log" Apr 16 14:52:42.007427 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:42.007333 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6f7bb56bb6-466ff_223bc827-900c-429e-a82b-27b6770f3dd4/manager/0.log" Apr 16 14:52:49.923104 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:49.923042 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4p4tq_e891483d-413b-4b2d-a2d0-ee6b42f8ccbf/kube-multus-additional-cni-plugins/0.log" Apr 16 14:52:49.946112 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:49.946076 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4p4tq_e891483d-413b-4b2d-a2d0-ee6b42f8ccbf/egress-router-binary-copy/0.log" Apr 16 14:52:49.970139 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:49.970114 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4p4tq_e891483d-413b-4b2d-a2d0-ee6b42f8ccbf/cni-plugins/0.log" Apr 16 14:52:49.993576 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:49.993548 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4p4tq_e891483d-413b-4b2d-a2d0-ee6b42f8ccbf/bond-cni-plugin/0.log" Apr 16 14:52:50.017334 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:50.017304 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4p4tq_e891483d-413b-4b2d-a2d0-ee6b42f8ccbf/routeoverride-cni/0.log" Apr 16 14:52:50.047085 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:50.047057 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4p4tq_e891483d-413b-4b2d-a2d0-ee6b42f8ccbf/whereabouts-cni-bincopy/0.log" Apr 16 14:52:50.070366 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:50.070342 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4p4tq_e891483d-413b-4b2d-a2d0-ee6b42f8ccbf/whereabouts-cni/0.log" Apr 16 14:52:50.546464 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:50.546430 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z5ff6_c7c31e47-1a62-42f4-b8c1-63188895e755/kube-multus/0.log" Apr 16 14:52:50.568354 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:50.568306 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9fx7w_e7706545-6db6-4426-919c-bf83b5020047/network-metrics-daemon/0.log" Apr 16 14:52:50.590402 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:50.590380 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9fx7w_e7706545-6db6-4426-919c-bf83b5020047/kube-rbac-proxy/0.log" Apr 16 14:52:51.575945 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:51.575912 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-controller/0.log" Apr 16 14:52:51.597792 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:51.597756 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/0.log" Apr 16 14:52:51.610092 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:51.610058 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/ovn-acl-logging/1.log" Apr 16 14:52:51.635838 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:51.635813 2563 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/kube-rbac-proxy-node/0.log" Apr 16 14:52:51.661424 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:51.661399 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 14:52:51.679593 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:51.679568 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/northd/0.log" Apr 16 14:52:51.703610 ip-10-0-141-239 kubenswrapper[2563]: I0416 14:52:51.703586 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8wdkz_29c3ac6b-dd94-4f4b-88ca-cf83af0046d3/nbdb/0.log"