Apr 22 18:39:54.926376 ip-10-0-129-85 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory Apr 22 18:39:54.926387 ip-10-0-129-85 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory Apr 22 18:39:54.926394 ip-10-0-129-85 systemd[1]: kubelet.service: Failed with result 'resources'. Apr 22 18:39:54.926638 ip-10-0-129-85 systemd[1]: Failed to start Kubernetes Kubelet. Apr 22 18:40:05.007937 ip-10-0-129-85 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found. Apr 22 18:40:05.007953 ip-10-0-129-85 systemd[1]: kubelet.service: Failed with result 'resources'. -- Boot 36722bf7378e4a90bba05ca9de214abb -- Apr 22 18:42:13.631107 ip-10-0-129-85 systemd[1]: Starting Kubernetes Kubelet... Apr 22 18:42:14.074819 ip-10-0-129-85 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 22 18:42:14.074819 ip-10-0-129-85 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 22 18:42:14.074819 ip-10-0-129-85 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 22 18:42:14.074819 ip-10-0-129-85 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 22 18:42:14.074819 ip-10-0-129-85 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 22 18:42:14.077864 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.077771 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 22 18:42:14.084774 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084758 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:14.084774 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084773 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084777 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084781 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084784 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084787 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084790 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084793 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084796 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084798 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084801 2578 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084803 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084806 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084808 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084811 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084814 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084816 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084819 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084821 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084824 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084826 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:14.084842 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084839 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084842 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:14.085310 ip-10-0-129-85 
kubenswrapper[2578]: W0422 18:42:14.084845 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084849 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084853 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084856 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084859 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084861 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084864 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084867 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084869 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084871 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084875 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084878 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: 
W0422 18:42:14.084880 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084883 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084885 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084888 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084891 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:14.085310 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084893 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084896 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084899 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084901 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084904 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084906 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084908 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084911 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:14.085812 
ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084913 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084916 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084918 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084921 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084923 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084925 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084928 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084932 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084935 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084937 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084939 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084942 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:14.085812 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084944 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 
18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084947 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084949 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084952 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084954 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084957 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084959 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084961 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084964 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084966 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084969 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084971 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084973 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084976 2578 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084978 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084980 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084983 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084985 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084989 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084993 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:14.086294 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084996 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.084999 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085002 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085004 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085007 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085009 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:14.086780 
ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085404 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085409 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085412 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085414 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085417 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085420 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085422 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085425 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085427 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085430 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085432 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085435 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085437 
2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:14.086780 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085440 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085442 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085445 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085449 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085452 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085455 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085458 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085461 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085464 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085466 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085469 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085471 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 
18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085474 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085476 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085479 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085481 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085484 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085486 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085488 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085491 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:14.087440 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085495 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085498 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085500 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085503 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085505 2578 feature_gate.go:328] 
unrecognized feature gate: NewOLM Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085508 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085511 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085513 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085515 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085518 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085520 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085523 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085525 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085529 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085531 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085534 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085536 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 
18:42:14.085539 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085541 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085543 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:14.088115 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085545 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085548 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085550 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085552 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085556 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085558 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085560 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085563 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085565 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085568 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:14.088630 ip-10-0-129-85 
kubenswrapper[2578]: W0422 18:42:14.085570 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085573 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085576 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085580 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085582 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085584 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085587 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085589 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085591 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085594 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:14.088630 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085596 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085599 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085601 2578 feature_gate.go:328] unrecognized 
feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085603 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085606 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085608 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085610 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085613 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085615 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085618 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085620 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085624 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.085627 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086282 2578 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086292 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086299 2578 flags.go:64] FLAG: 
--anonymous-auth="true"
Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086304 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086308 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086311 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086316 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086320 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:42:14.089162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086323 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086326 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086333 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086337 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086340 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086343 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086346 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086349 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086352 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086355 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086357 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086361 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086364 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086367 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086370 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086373 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086377 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086380 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086383 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086386 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086389 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086392 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086395 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086398 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086401 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:42:14.089693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086405 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086408 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086411 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086414 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086417 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086419 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086424 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086427 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086430 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086434 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086437 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086441 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086444 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086447 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086450 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086453 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086456 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086459 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086462 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086465 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086468 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086470 2578 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086474 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086477 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086480 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:42:14.090288 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086483 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086488 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086491 2578 flags.go:64] FLAG: --help="false"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086494 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-129-85.ec2.internal"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086497 2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086500 2578 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086503 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086506 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086509 2578 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086512 2578 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086515 2578 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086517 2578 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086520 2578 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086523 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086526 2578 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086529 2578 flags.go:64] FLAG: --kube-reserved=""
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086533 2578 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086536 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086539 2578 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086542 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086545 2578 flags.go:64] FLAG: --lock-file=""
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086548 2578 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086551 2578 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086554 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 18:42:14.090942 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086559 2578 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086561 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086564 2578 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086567 2578 flags.go:64] FLAG: --logging-format="text"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086569 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086573 2578 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086576 2578 flags.go:64] FLAG: --manifest-url=""
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086578 2578 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086583 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086587 2578 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086591 2578 flags.go:64] FLAG: --max-pods="110"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086594 2578 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086596 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086599 2578 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086602 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086605 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086608 2578 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086611 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086618 2578 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086621 2578 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086624 2578 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086627 2578 flags.go:64] FLAG: --pod-cidr=""
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086630 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 18:42:14.091520 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086650 2578 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086653 2578 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086657 2578 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086661 2578 flags.go:64] FLAG: --port="10250"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086664 2578 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086667 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0382f830fb08e1e6c"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086670 2578 flags.go:64] FLAG: --qos-reserved=""
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086674 2578 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086677 2578 flags.go:64] FLAG: --register-node="true"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086680 2578 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086682 2578 flags.go:64] FLAG: --register-with-taints=""
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086687 2578 flags.go:64] FLAG: --registry-burst="10"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086690 2578 flags.go:64] FLAG: --registry-qps="5"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086692 2578 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086695 2578 flags.go:64] FLAG: --reserved-memory=""
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086699 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086701 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086704 2578 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086708 2578 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086711 2578 flags.go:64] FLAG: --runonce="false"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086714 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086717 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086720 2578 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086723 2578 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086726 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086729 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 18:42:14.092215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086732 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086735 2578 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086738 2578 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086740 2578 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086743 2578 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086746 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086749 2578 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086752 2578 flags.go:64] FLAG: --system-cgroups=""
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086756 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086761 2578 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086764 2578 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086767 2578 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086770 2578 flags.go:64] FLAG: --tls-min-version=""
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086773 2578 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086776 2578 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086779 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086782 2578 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086785 2578 flags.go:64] FLAG: --v="2"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086790 2578 flags.go:64] FLAG: --version="false"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086793 2578 flags.go:64] FLAG: --vmodule=""
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086798 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.086801 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087282 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087295 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:42:14.092845 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087301 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087306 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087310 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087314 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087318 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087323 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087327 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087331 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087335 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087339 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087343 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087353 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087359 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087365 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087370 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087375 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087383 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087387 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087392 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087397 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:42:14.093412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087402 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087407 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087411 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087421 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087425 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087430 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087434 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087438 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087443 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087447 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087451 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087456 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087461 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087465 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087469 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087474 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087484 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087488 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087493 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087497 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:42:14.093966 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087502 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087506 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087511 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087515 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087519 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087524 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087528 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087532 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087542 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087546 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087550 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087555 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087559 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087563 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087567 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087571 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087576 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087581 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087585 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087589 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:42:14.094449 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087598 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087602 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087606 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087610 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087615 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087620 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087623 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087627 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087632 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087649 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087652 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087655 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087659 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087696 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087716 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087722 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087728 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087733 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087739 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:42:14.094951 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087744 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087749 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087754 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087758 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.087765 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.088424 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.094854 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.094870 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094920 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094925 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094929 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094931 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094935 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094938 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094941 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094944 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:42:14.095412 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094946 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094949 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094951 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094954 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094957 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094959 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094962 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094965 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094968 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094970 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094973 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094976 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094978 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094981 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094984 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094986 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094989 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094991 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094994 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:42:14.095889 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094997 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.094999 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095002 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095005 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095007 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095010 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095013 2578 feature_gate.go:328] unrecognized feature gate:
BuildCSIVolumes Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095016 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095019 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095022 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095024 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095027 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095030 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095032 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095035 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095037 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095039 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095043 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095047 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:42:14.096337 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095051 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095054 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095057 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095060 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095062 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095065 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095067 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095070 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095073 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095075 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095078 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095081 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095083 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095086 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095088 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095090 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095093 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095096 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095099 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095103 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:42:14.096806 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095107 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095110 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095112 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095114 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095117 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095119 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095122 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095124 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095127 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095130 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095132 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095135 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095137 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095141 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095143 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095146 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095148 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095150 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095153 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:42:14.097284 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095155 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.095160 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095272 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095276 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095279 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095282 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095284 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095287 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095290 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095292 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095295 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095298 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095300 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095303 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095305 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:42:14.097781 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095308 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095311 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095314 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095318 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095321 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095323 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095325 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095328 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095331 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095334 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095336 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095338 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095342 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095344 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095347 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095349 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095352 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095354 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095356 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:42:14.098136 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095359 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095361 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095364 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095366 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095368 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095371 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095373 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095376 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095378 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095381 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095384 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095388 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095392 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095394 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095397 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095400 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095403 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095405 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095408 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095410 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:42:14.098655 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095413 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095415 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095418 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095421 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095424 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095426 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095429 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095431 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095434 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095437 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095439 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095442 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095444 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095446 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095449 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095451 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095453 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095456 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095459 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095461 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:42:14.099145 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095464 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095466 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095469 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095472 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095475 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095477 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095480 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095482 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095485 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095487 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095489 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095491 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095494 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:14.095496 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.095500 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:42:14.099736 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.096261 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:42:14.100129 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.098185 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:42:14.100129 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.099181 2578 server.go:1019] "Starting client certificate rotation"
Apr 22 18:42:14.100129 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.099297 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:42:14.100129 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.099350 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:42:14.124197 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.124177 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:42:14.128447 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.128421 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:42:14.143317 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.143298 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:42:14.149685 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.149664 2578 log.go:25] "Validated CRI v1 image API"
Apr 22 18:42:14.151353 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.151336 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:42:14.157226 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.157208 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:42:14.157386 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.157367 2578 fs.go:135] Filesystem UUIDs: map[50872189-fedc-48bf-a24f-6d486c3f882c:/dev/nvme0n1p3 6c3b2826-6efa-4351-b036-025be014af0e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 18:42:14.157430 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.157388 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:42:14.162915 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.162804 2578 manager.go:217] Machine: {Timestamp:2026-04-22 18:42:14.161053714 +0000 UTC m=+0.403394103 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098884 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec232fded17d1ac54c6e22db4bbf22cc SystemUUID:ec232fde-d17d-1ac5-4c6e-22db4bbf22cc BootID:36722bf7-378e-4a90-bba0-5ca9de214abb Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:78:e1:3d:d4:63 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:78:e1:3d:d4:63 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:eb:27:01:68:27 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:42:14.162915 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.162906 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:42:14.163046 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.162986 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:42:14.166138 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.166113 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:42:14.166275 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.166140 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-85.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:42:14.166831 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.166821 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:42:14.166862 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.166834 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:42:14.166862 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.166847 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:42:14.166862 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.166857 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:42:14.167954 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.167943 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:42:14.168095 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.168086 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:42:14.170052 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.170043 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:42:14.170089 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.170061 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:42:14.170089 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.170072 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:42:14.170089 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.170082 2578 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:42:14.170208 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.170091 2578 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 18:42:14.171049 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.171037 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:42:14.171093 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.171056 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:42:14.173519 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.173502 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:42:14.174712 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.174695 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:42:14.175968 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.175954 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.175976 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.175986 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.175994 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176002 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176011 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176020 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176028 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176039 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176048 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176059 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:42:14.176112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.176074 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:42:14.177602 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.177590 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:42:14.177686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.177605 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:42:14.180971 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.180956 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:42:14.181067 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.180998 2578 server.go:1295] "Started kubelet" Apr 22 18:42:14.181118 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.181065 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:42:14.181701 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.181620 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:42:14.181771 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.181752 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:42:14.181764 ip-10-0-129-85 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:42:14.182748 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.182725 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:42:14.183614 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.183601 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:42:14.186291 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.186267 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-85.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:42:14.186579 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.186552 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:42:14.187271 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.187240 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-85.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:42:14.189415 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.188603 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-85.ec2.internal.18a8c1fa652b1985 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-85.ec2.internal,UID:ip-10-0-129-85.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-85.ec2.internal,},FirstTimestamp:2026-04-22 18:42:14.180968837 +0000 UTC m=+0.423309229,LastTimestamp:2026-04-22 18:42:14.180968837 +0000 UTC m=+0.423309229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-85.ec2.internal,}" Apr 22 18:42:14.189695 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.189674 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:42:14.189796 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.189696 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:42:14.190260 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.190241 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:42:14.190260 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.190261 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:42:14.190382 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.190269 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:42:14.190382 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.190314 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:42:14.190382 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.190322 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:42:14.190790 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.190760 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.191085 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191070 2578 factory.go:153] Registering CRI-O factory 
Apr 22 18:42:14.191177 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191089 2578 factory.go:223] Registration of the crio container factory successfully Apr 22 18:42:14.191177 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191135 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:42:14.191177 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191145 2578 factory.go:55] Registering systemd factory Apr 22 18:42:14.191177 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191153 2578 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:42:14.191177 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191174 2578 factory.go:103] Registering Raw factory Apr 22 18:42:14.191363 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191187 2578 manager.go:1196] Started watching for new ooms in manager Apr 22 18:42:14.192200 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.191774 2578 manager.go:319] Starting recovery of all containers Apr 22 18:42:14.193803 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.193769 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:42:14.198780 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.198730 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 18:42:14.200965 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.200940 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:42:14.201087 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.201063 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-85.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:42:14.204109 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.203996 2578 manager.go:324] Recovery completed Apr 22 18:42:14.205397 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.205370 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:42:14.208730 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.208714 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:14.212931 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.212914 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:14.213003 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.212944 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:14.213003 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.212956 2578 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:14.213363 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.213349 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:42:14.213363 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.213361 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:42:14.213443 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.213376 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:42:14.215044 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.214956 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-85.ec2.internal.18a8c1fa6712cf0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-85.ec2.internal,UID:ip-10-0-129-85.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-85.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-85.ec2.internal,},FirstTimestamp:2026-04-22 18:42:14.212931343 +0000 UTC m=+0.455271733,LastTimestamp:2026-04-22 18:42:14.212931343 +0000 UTC m=+0.455271733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-85.ec2.internal,}" Apr 22 18:42:14.215466 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.215448 2578 policy_none.go:49] "None policy: Start" Apr 22 18:42:14.215519 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.215471 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:42:14.215519 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.215481 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:42:14.224063 
ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.223994 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-85.ec2.internal.18a8c1fa671312de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-85.ec2.internal,UID:ip-10-0-129-85.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-85.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-85.ec2.internal,},FirstTimestamp:2026-04-22 18:42:14.212948702 +0000 UTC m=+0.455289092,LastTimestamp:2026-04-22 18:42:14.212948702 +0000 UTC m=+0.455289092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-85.ec2.internal,}" Apr 22 18:42:14.234444 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.234380 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-85.ec2.internal.18a8c1fa671343ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-85.ec2.internal,UID:ip-10-0-129-85.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-129-85.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-129-85.ec2.internal,},FirstTimestamp:2026-04-22 18:42:14.212961262 +0000 UTC m=+0.455301654,LastTimestamp:2026-04-22 18:42:14.212961262 +0000 UTC m=+0.455301654,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-85.ec2.internal,}" Apr 22 18:42:14.258259 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.258225 2578 manager.go:341] "Starting Device Plugin manager" Apr 22 18:42:14.258259 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.258255 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:42:14.258259 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.258266 2578 server.go:85] "Starting device plugin registration server" Apr 22 18:42:14.261373 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.258526 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:42:14.261373 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.258542 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:42:14.261373 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.258629 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:42:14.261373 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.258733 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:42:14.261373 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.258742 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:42:14.261373 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.259580 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 18:42:14.261373 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.259691 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.270569 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.270505 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-85.ec2.internal.18a8c1fa69fcb18f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-85.ec2.internal,UID:ip-10-0-129-85.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-129-85.ec2.internal,},FirstTimestamp:2026-04-22 18:42:14.261813647 +0000 UTC m=+0.504154040,LastTimestamp:2026-04-22 18:42:14.261813647 +0000 UTC m=+0.504154040,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-85.ec2.internal,}" Apr 22 18:42:14.290785 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.290760 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:42:14.293249 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.293237 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:42:14.293319 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.293263 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:42:14.293319 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.293274 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:42:14.293319 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.293306 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:42:14.313069 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.313040 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 22 18:42:14.326304 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.326261 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zskcf" Apr 22 18:42:14.335040 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.335022 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zskcf" Apr 22 18:42:14.359337 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.359307 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:14.360125 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.360110 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:14.360184 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.360139 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:14.360184 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.360149 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" 
event="NodeHasSufficientPID" Apr 22 18:42:14.360184 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.360170 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.378254 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.378235 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.378321 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.378258 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-85.ec2.internal\": node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.393370 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.393347 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal"] Apr 22 18:42:14.393426 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.393411 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:14.394625 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.394608 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:14.394711 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.394656 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:14.394711 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.394667 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:14.395696 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.395684 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:14.395773 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 18:42:14.395760 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.395806 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.395796 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:14.396374 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.396354 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:14.396460 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.396384 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:14.396460 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.396394 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:14.396460 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.396360 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:14.396460 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.396455 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:14.396604 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.396471 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:14.397044 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.397023 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.397535 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.397522 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.397581 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.397547 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:14.398221 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.398195 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:14.398277 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.398233 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:14.398277 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.398242 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:14.428181 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.428158 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-85.ec2.internal\" not found" node="ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.432544 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.432526 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-85.ec2.internal\" not found" node="ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.491721 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.491697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/79daf747565244a11b1a61e38cc6d0df-config\") pod \"kube-apiserver-proxy-ip-10-0-129-85.ec2.internal\" (UID: \"79daf747565244a11b1a61e38cc6d0df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.491721 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.491723 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c535ba14e14c867f6ac9b6b8e0cc4308-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal\" (UID: \"c535ba14e14c867f6ac9b6b8e0cc4308\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.491862 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.491740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c535ba14e14c867f6ac9b6b8e0cc4308-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal\" (UID: \"c535ba14e14c867f6ac9b6b8e0cc4308\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.497786 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.497771 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.592509 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.592408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c535ba14e14c867f6ac9b6b8e0cc4308-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal\" (UID: \"c535ba14e14c867f6ac9b6b8e0cc4308\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.592509 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.592453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c535ba14e14c867f6ac9b6b8e0cc4308-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal\" (UID: \"c535ba14e14c867f6ac9b6b8e0cc4308\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 
22 18:42:14.592509 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.592470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/79daf747565244a11b1a61e38cc6d0df-config\") pod \"kube-apiserver-proxy-ip-10-0-129-85.ec2.internal\" (UID: \"79daf747565244a11b1a61e38cc6d0df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.592509 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.592516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/79daf747565244a11b1a61e38cc6d0df-config\") pod \"kube-apiserver-proxy-ip-10-0-129-85.ec2.internal\" (UID: \"79daf747565244a11b1a61e38cc6d0df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.592775 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.592524 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c535ba14e14c867f6ac9b6b8e0cc4308-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal\" (UID: \"c535ba14e14c867f6ac9b6b8e0cc4308\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.592775 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.592529 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c535ba14e14c867f6ac9b6b8e0cc4308-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal\" (UID: \"c535ba14e14c867f6ac9b6b8e0cc4308\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.598508 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.598490 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.699270 ip-10-0-129-85 
kubenswrapper[2578]: E0422 18:42:14.699237 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.731409 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.731391 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.734803 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:14.734778 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" Apr 22 18:42:14.799658 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.799623 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:14.900483 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:14.900407 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:15.000901 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:15.000870 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:15.100409 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.100370 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:42:15.101458 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:15.101432 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:15.190737 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.190714 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:42:15.201704 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:15.201626 2578 kubelet_node_status.go:515] "Error getting 
the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:15.208426 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.208408 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:42:15.232664 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.232623 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sswm2" Apr 22 18:42:15.233383 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.233366 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:15.241602 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.241578 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sswm2" Apr 22 18:42:15.302545 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:15.302519 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-85.ec2.internal\" not found" Apr 22 18:42:15.319064 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:15.319020 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79daf747565244a11b1a61e38cc6d0df.slice/crio-05442d2f095c5ec1d78cb32acbc2351a1c69bd0c360c4412280878cd9393f69d WatchSource:0}: Error finding container 05442d2f095c5ec1d78cb32acbc2351a1c69bd0c360c4412280878cd9393f69d: Status 404 returned error can't find the container with id 05442d2f095c5ec1d78cb32acbc2351a1c69bd0c360c4412280878cd9393f69d Apr 22 18:42:15.319258 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:15.319241 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc535ba14e14c867f6ac9b6b8e0cc4308.slice/crio-913cb7a8ff2b5d8cc9d0315c1403e46d787039d8bb890c64aaca3abb33ebe34b WatchSource:0}: Error finding container 913cb7a8ff2b5d8cc9d0315c1403e46d787039d8bb890c64aaca3abb33ebe34b: Status 404 returned error can't find the container with id 913cb7a8ff2b5d8cc9d0315c1403e46d787039d8bb890c64aaca3abb33ebe34b Apr 22 18:42:15.323206 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.323191 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:42:15.336839 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.336805 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:37:14 +0000 UTC" deadline="2028-01-11 15:03:52.264449892 +0000 UTC" Apr 22 18:42:15.336839 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.336833 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15092h21m36.927620019s" Apr 22 18:42:15.341487 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.341469 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:15.381185 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.381164 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:15.390489 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.390472 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" Apr 22 18:42:15.406287 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.406264 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not 
contain dots]" Apr 22 18:42:15.407105 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.407091 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" Apr 22 18:42:15.417056 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.417036 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:42:15.679971 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:15.679943 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:16.171175 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.171108 2578 apiserver.go:52] "Watching apiserver" Apr 22 18:42:16.180034 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.180009 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:42:16.180981 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.180945 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal","openshift-multus/network-metrics-daemon-l6jrz","openshift-network-diagnostics/network-check-target-mbl6x","openshift-network-operator/iptables-alerter-4cgd8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5","openshift-cluster-node-tuning-operator/tuned-4l828","openshift-image-registry/node-ca-67qqz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal","openshift-multus/multus-additional-cni-plugins-c4hqs","openshift-multus/multus-x5q6c","openshift-ovn-kubernetes/ovnkube-node-sr4r9","kube-system/konnectivity-agent-7n89x"] Apr 22 18:42:16.185260 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.185236 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:16.185360 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.185336 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:16.187032 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.187013 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:16.187111 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.187069 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:16.187152 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.187102 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 18:42:16.189116 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.189094 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.189763 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.189735 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:42:16.189939 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.189918 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:42:16.189996 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.189961 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:42:16.190061 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.189923 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6725v\"" Apr 22 18:42:16.191224 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.191202 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.191802 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.191783 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:42:16.191894 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.191784 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:42:16.191953 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.191926 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:42:16.192284 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.192270 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xnt57\"" Apr 22 18:42:16.193327 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.193291 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-67qqz" Apr 22 18:42:16.193720 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.193704 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:42:16.193809 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.193791 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4xfr4\"" Apr 22 18:42:16.194129 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.194111 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:42:16.195675 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.195484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.195765 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.195748 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-75794\"" Apr 22 18:42:16.196386 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.196365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:42:16.196386 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.196378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:42:16.196573 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.196559 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:42:16.197968 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.197950 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:42:16.198057 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.197990 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nd27q\"" Apr 22 18:42:16.198119 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.198066 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:42:16.198119 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.198111 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:42:16.198409 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.198395 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:42:16.199134 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.199112 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:42:16.199346 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.199333 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.200959 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.200941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysconfig\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201050 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.200971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-var-lib-kubelet\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201050 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.200999 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9d4cda14-9eb0-451a-95dd-098981e2a91b-iptables-alerter-script\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 18:42:16.201050 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jhwb\" (UniqueName: \"kubernetes.io/projected/9d4cda14-9eb0-451a-95dd-098981e2a91b-kube-api-access-8jhwb\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 18:42:16.201172 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201069 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-registration-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.201172 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-run\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201172 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-lib-modules\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201279 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpdqf\" (UniqueName: \"kubernetes.io/projected/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-kube-api-access-rpdqf\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.201279 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-kubernetes\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201279 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:42:16.201245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-host\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201279 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-tuned\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201279 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/644b7849-60c1-43db-80c0-b41ef5e73b3f-tmp\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201468 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201291 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:16.201468 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.201468 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-etc-selinux\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.201468 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-sys\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201468 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-socket-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.201468 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-systemd\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lrxq\" (UniqueName: 
\"kubernetes.io/projected/644b7849-60c1-43db-80c0-b41ef5e73b3f-kube-api-access-9lrxq\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysctl-d\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r57z\" (UniqueName: \"kubernetes.io/projected/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-kube-api-access-6r57z\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d4cda14-9eb0-451a-95dd-098981e2a91b-host-slash\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 18:42:16.201761 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:42:16.201603 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201611 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g422j\"" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-device-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-sys-fs\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.201761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-modprobe-d\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.202070 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201775 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysctl-conf\") pod \"tuned-4l828\" (UID: 
\"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.202070 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.201888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.204357 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.204335 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:42:16.204432 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.204383 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:42:16.205151 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.205134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:16.206017 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.205847 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:42:16.206017 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.205860 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:42:16.206017 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.205926 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:42:16.206223 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.206203 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:42:16.206326 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.206218 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dqsnv\"" Apr 22 18:42:16.207691 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.207673 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:42:16.208368 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.208350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zqm56\"" Apr 22 18:42:16.208698 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.208682 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:42:16.242932 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.242898 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:37:15 +0000 UTC" deadline="2028-01-25 11:00:15.359175668 +0000 UTC" Apr 22 18:42:16.243122 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.243104 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15424h17m59.116077494s" Apr 22 18:42:16.291306 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.291271 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:42:16.297883 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.297820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" event={"ID":"c535ba14e14c867f6ac9b6b8e0cc4308","Type":"ContainerStarted","Data":"913cb7a8ff2b5d8cc9d0315c1403e46d787039d8bb890c64aaca3abb33ebe34b"} Apr 22 18:42:16.299065 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.299040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" 
event={"ID":"79daf747565244a11b1a61e38cc6d0df","Type":"ContainerStarted","Data":"05442d2f095c5ec1d78cb32acbc2351a1c69bd0c360c4412280878cd9393f69d"} Apr 22 18:42:16.302310 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lrxq\" (UniqueName: \"kubernetes.io/projected/644b7849-60c1-43db-80c0-b41ef5e73b3f-kube-api-access-9lrxq\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.302414 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-slash\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302414 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-k8s-cni-cncf-io\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.302414 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302382 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-etc-kubernetes\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.302414 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-system-cni-dir\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.302414 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302413 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-var-lib-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovn-node-metrics-cert\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-device-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-sys-fs\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.302633 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:42:16.302500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-node-log\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-host\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysctl-conf\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302575 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-systemd-units\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-cni-netd\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdrf\" (UniqueName: \"kubernetes.io/projected/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-kube-api-access-qsdrf\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-cnibin\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302695 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-systemd\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302719 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cnibin\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-cni-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c557042d-06c7-4315-8e35-0884cd906ef9-cni-binary-copy\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-multus-certs\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302800 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9d4cda14-9eb0-451a-95dd-098981e2a91b-iptables-alerter-script\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-registration-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-lib-modules\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt5d7\" (UniqueName: \"kubernetes.io/projected/2a0c68c2-fd6c-49b2-bf04-84096034153e-kube-api-access-vt5d7\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" 
Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f34c1a3-3713-45c0-b770-cec5862c620d-agent-certs\") pod \"konnectivity-agent-7n89x\" (UID: \"3f34c1a3-3713-45c0-b770-cec5862c620d\") " pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpdqf\" (UniqueName: \"kubernetes.io/projected/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-kube-api-access-rpdqf\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.302997 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.302976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-kubernetes\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-tuned\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/644b7849-60c1-43db-80c0-b41ef5e73b3f-tmp\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-cni-bin\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r84b5\" (UniqueName: \"kubernetes.io/projected/c557042d-06c7-4315-8e35-0884cd906ef9-kube-api-access-r84b5\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-sys\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-ovn\") pod \"ovnkube-node-sr4r9\" 
(UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-os-release\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-systemd\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-kubelet\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-etc-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-cni-multus\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysctl-d\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303281 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-run-netns\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-run-ovn-kubernetes\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.303434 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303342 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovnkube-config\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-serviceca\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303381 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-cni-bin\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r57z\" (UniqueName: \"kubernetes.io/projected/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-kube-api-access-6r57z\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303423 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d4cda14-9eb0-451a-95dd-098981e2a91b-host-slash\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 
18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-modprobe-d\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303470 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovnkube-script-lib\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303495 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f34c1a3-3713-45c0-b770-cec5862c620d-konnectivity-ca\") pod \"konnectivity-agent-7n89x\" (UID: \"3f34c1a3-3713-45c0-b770-cec5862c620d\") " pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-hostroot\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysconfig\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-var-lib-kubelet\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303623 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-netns\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhwb\" (UniqueName: \"kubernetes.io/projected/9d4cda14-9eb0-451a-95dd-098981e2a91b-kube-api-access-8jhwb\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-run\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-env-overrides\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c557042d-06c7-4315-8e35-0884cd906ef9-multus-daemon-config\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.304102 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-host\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303786 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-log-socket\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ktbnh\" (UniqueName: \"kubernetes.io/projected/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-kube-api-access-ktbnh\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz" Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cni-binary-copy\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-os-release\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-socket-dir-parent\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-kubelet\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 
18:42:16.303920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303946 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.303976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-etc-selinux\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-system-cni-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-conf-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-socket-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-socket-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-device-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-kubernetes\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-sys-fs\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.304719 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysctl-conf\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305404 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304519 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:42:16.305404 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.304977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-host\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305404 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305047 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-registration-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.305404 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-run\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305539 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-var-lib-kubelet\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305539 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-systemd\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305539 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysconfig\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305539 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305502 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d4cda14-9eb0-451a-95dd-098981e2a91b-host-slash\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8"
Apr 22 18:42:16.305701 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305544 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-lib-modules\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305701 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-etc-selinux\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.305767 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.305705 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:42:16.305767 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-modprobe-d\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.305767 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9d4cda14-9eb0-451a-95dd-098981e2a91b-iptables-alerter-script\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8"
Apr 22 18:42:16.305894 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.305828 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-sysctl-d\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.306016 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.306002 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:16.805956106 +0000 UTC m=+3.048296484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:42:16.306100 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.306030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/644b7849-60c1-43db-80c0-b41ef5e73b3f-sys\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.306156 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.306115 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.308773 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.308752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/644b7849-60c1-43db-80c0-b41ef5e73b3f-etc-tuned\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.308879 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.308788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/644b7849-60c1-43db-80c0-b41ef5e73b3f-tmp\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.330056 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.330029 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:42:16.330056 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.330055 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:42:16.330220 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.330065 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jdt8v for pod openshift-network-diagnostics/network-check-target-mbl6x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:42:16.330220 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.330112 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v podName:4790d6e1-aad7-43f0-95f6-04dde16468f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:16.830098182 +0000 UTC m=+3.072438559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jdt8v" (UniqueName: "kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v") pod "network-check-target-mbl6x" (UID: "4790d6e1-aad7-43f0-95f6-04dde16468f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:42:16.334149 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.334125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r57z\" (UniqueName: \"kubernetes.io/projected/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-kube-api-access-6r57z\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz"
Apr 22 18:42:16.334557 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.334536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jhwb\" (UniqueName: \"kubernetes.io/projected/9d4cda14-9eb0-451a-95dd-098981e2a91b-kube-api-access-8jhwb\") pod \"iptables-alerter-4cgd8\" (UID: \"9d4cda14-9eb0-451a-95dd-098981e2a91b\") " pod="openshift-network-operator/iptables-alerter-4cgd8"
Apr 22 18:42:16.334682 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.334576 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lrxq\" (UniqueName: \"kubernetes.io/projected/644b7849-60c1-43db-80c0-b41ef5e73b3f-kube-api-access-9lrxq\") pod \"tuned-4l828\" (UID: \"644b7849-60c1-43db-80c0-b41ef5e73b3f\") " pod="openshift-cluster-node-tuning-operator/tuned-4l828"
Apr 22 18:42:16.335398 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.335377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpdqf\" (UniqueName: \"kubernetes.io/projected/743a3f4b-8a0a-4113-a780-f4f12d4c5db5-kube-api-access-rpdqf\") pod \"aws-ebs-csi-driver-node-8kkh5\" (UID: \"743a3f4b-8a0a-4113-a780-f4f12d4c5db5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5"
Apr 22 18:42:16.404947 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.404912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-kubelet\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.404947 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.404947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-etc-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.404962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-cni-multus\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-run-netns\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-etc-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-run-ovn-kubernetes\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-run-netns\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-kubelet\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-run-ovn-kubernetes\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-cni-multus\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovnkube-config\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405186 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-serviceca\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-cni-bin\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovnkube-script-lib\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405280 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f34c1a3-3713-45c0-b770-cec5862c620d-konnectivity-ca\") pod \"konnectivity-agent-7n89x\" (UID: \"3f34c1a3-3713-45c0-b770-cec5862c620d\") " pod="kube-system/konnectivity-agent-7n89x"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-hostroot\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-netns\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-env-overrides\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c557042d-06c7-4315-8e35-0884cd906ef9-multus-daemon-config\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-log-socket\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405438 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktbnh\" (UniqueName: \"kubernetes.io/projected/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-kube-api-access-ktbnh\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cni-binary-copy\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-os-release\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-socket-dir-parent\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-kubelet\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405550 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-system-cni-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-conf-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-slash\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-k8s-cni-cncf-io\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-etc-kubernetes\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-system-cni-dir\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-serviceca\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-var-lib-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-var-lib-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovn-node-metrics-cert\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-cni-bin\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-node-log\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-host\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz"
Apr 22 18:42:16.405829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-systemd-units\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-cni-netd\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdrf\" (UniqueName: \"kubernetes.io/projected/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-kube-api-access-qsdrf\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405960 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-cnibin\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.405985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-systemd\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cnibin\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-cni-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c557042d-06c7-4315-8e35-0884cd906ef9-cni-binary-copy\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-multus-certs\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f34c1a3-3713-45c0-b770-cec5862c620d-konnectivity-ca\") pod \"konnectivity-agent-7n89x\" (UID: \"3f34c1a3-3713-45c0-b770-cec5862c620d\") " pod="kube-system/konnectivity-agent-7n89x"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovnkube-config\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406398 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-cni-netd\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt5d7\" (UniqueName: \"kubernetes.io/projected/2a0c68c2-fd6c-49b2-bf04-84096034153e-kube-api-access-vt5d7\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406598 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs"
Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406626 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f34c1a3-3713-45c0-b770-cec5862c620d-agent-certs\") pod \"konnectivity-agent-7n89x\" (UID: \"3f34c1a3-3713-45c0-b770-cec5862c620d\") " pod="kube-system/konnectivity-agent-7n89x"
Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-hostroot\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-cni-bin\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9"
Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-netns\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c"
Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"kube-api-access-r84b5\" (UniqueName: \"kubernetes.io/projected/c557042d-06c7-4315-8e35-0884cd906ef9-kube-api-access-r84b5\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406769 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-ovn\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-multus-certs\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-os-release\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.406995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-os-release\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407023 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-env-overrides\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-log-socket\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-node-log\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407157 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-cnibin\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407174 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-systemd\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cnibin\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-cni-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.406595 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovnkube-script-lib\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.407686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-host\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-conf-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407746 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cni-binary-copy\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407776 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-slash\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-run-k8s-cni-cncf-io\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-os-release\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c557042d-06c7-4315-8e35-0884cd906ef9-cni-binary-copy\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-etc-kubernetes\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407867 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-host-var-lib-kubelet\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-system-cni-dir\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-multus-socket-dir-parent\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407920 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c557042d-06c7-4315-8e35-0884cd906ef9-system-cni-dir\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-systemd-units\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-host-cni-bin\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-openvswitch\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.407980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a0c68c2-fd6c-49b2-bf04-84096034153e-run-ovn\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.408183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.408032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.408960 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.408403 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c557042d-06c7-4315-8e35-0884cd906ef9-multus-daemon-config\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.410022 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.409989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/2a0c68c2-fd6c-49b2-bf04-84096034153e-ovn-node-metrics-cert\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.411408 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.411383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f34c1a3-3713-45c0-b770-cec5862c620d-agent-certs\") pod \"konnectivity-agent-7n89x\" (UID: \"3f34c1a3-3713-45c0-b770-cec5862c620d\") " pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:16.439466 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.439444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktbnh\" (UniqueName: \"kubernetes.io/projected/6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8-kube-api-access-ktbnh\") pod \"node-ca-67qqz\" (UID: \"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8\") " pod="openshift-image-registry/node-ca-67qqz" Apr 22 18:42:16.439704 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.439681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdrf\" (UniqueName: \"kubernetes.io/projected/ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68-kube-api-access-qsdrf\") pod \"multus-additional-cni-plugins-c4hqs\" (UID: \"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68\") " pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.441139 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.441122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt5d7\" (UniqueName: \"kubernetes.io/projected/2a0c68c2-fd6c-49b2-bf04-84096034153e-kube-api-access-vt5d7\") pod \"ovnkube-node-sr4r9\" (UID: \"2a0c68c2-fd6c-49b2-bf04-84096034153e\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.444245 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.444228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r84b5\" (UniqueName: \"kubernetes.io/projected/c557042d-06c7-4315-8e35-0884cd906ef9-kube-api-access-r84b5\") pod \"multus-x5q6c\" (UID: \"c557042d-06c7-4315-8e35-0884cd906ef9\") " pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.499243 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.499213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4cgd8" Apr 22 18:42:16.507794 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.507282 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" Apr 22 18:42:16.514457 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.514429 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4l828" Apr 22 18:42:16.518994 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.518973 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-67qqz" Apr 22 18:42:16.526570 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.526550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" Apr 22 18:42:16.533119 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.533096 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x5q6c" Apr 22 18:42:16.540601 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.540582 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:16.548116 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.548096 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:16.809193 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.809103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:16.809347 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.809237 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:16.809347 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.809304 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:17.809287866 +0000 UTC m=+4.051628245 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:16.909671 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:16.909613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:16.909851 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.909787 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:16.909851 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.909813 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:16.909851 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.909825 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jdt8v for pod openshift-network-diagnostics/network-check-target-mbl6x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:16.910001 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:16.909895 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v podName:4790d6e1-aad7-43f0-95f6-04dde16468f8 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:42:17.909877092 +0000 UTC m=+4.152217472 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdt8v" (UniqueName: "kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v") pod "network-check-target-mbl6x" (UID: "4790d6e1-aad7-43f0-95f6-04dde16468f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:17.056753 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.056720 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3096c0_28bb_48d6_a4f9_f3fc9bf195d8.slice/crio-67395dc3458062c7f477f0089fc8b789a80d88265dfaeaeb3d90e58b882fe18e WatchSource:0}: Error finding container 67395dc3458062c7f477f0089fc8b789a80d88265dfaeaeb3d90e58b882fe18e: Status 404 returned error can't find the container with id 67395dc3458062c7f477f0089fc8b789a80d88265dfaeaeb3d90e58b882fe18e Apr 22 18:42:17.061802 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.061780 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae80bd70_7d80_4b8b_a4e9_5728e1e5fe68.slice/crio-29097e7e54921d02bdc047ebe56adaea57e74cc5c53fad37e3284b233abdf531 WatchSource:0}: Error finding container 29097e7e54921d02bdc047ebe56adaea57e74cc5c53fad37e3284b233abdf531: Status 404 returned error can't find the container with id 29097e7e54921d02bdc047ebe56adaea57e74cc5c53fad37e3284b233abdf531 Apr 22 18:42:17.063362 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.063309 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod644b7849_60c1_43db_80c0_b41ef5e73b3f.slice/crio-e0332dcd08d9fad0be65a788ac54e3df9c316de5de3ca0529933296e60086ae8 WatchSource:0}: Error finding container 
e0332dcd08d9fad0be65a788ac54e3df9c316de5de3ca0529933296e60086ae8: Status 404 returned error can't find the container with id e0332dcd08d9fad0be65a788ac54e3df9c316de5de3ca0529933296e60086ae8 Apr 22 18:42:17.065905 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.065879 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4cda14_9eb0_451a_95dd_098981e2a91b.slice/crio-1b411c6bfcef6988b376a6cb804d065f9901383c0cb02c89ad49ed4dc227d53a WatchSource:0}: Error finding container 1b411c6bfcef6988b376a6cb804d065f9901383c0cb02c89ad49ed4dc227d53a: Status 404 returned error can't find the container with id 1b411c6bfcef6988b376a6cb804d065f9901383c0cb02c89ad49ed4dc227d53a Apr 22 18:42:17.066677 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.066651 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743a3f4b_8a0a_4113_a780_f4f12d4c5db5.slice/crio-6a631a26295f3a069597836ac803fb0c42507794db32d11305b0d4392649895f WatchSource:0}: Error finding container 6a631a26295f3a069597836ac803fb0c42507794db32d11305b0d4392649895f: Status 404 returned error can't find the container with id 6a631a26295f3a069597836ac803fb0c42507794db32d11305b0d4392649895f Apr 22 18:42:17.069070 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.069053 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc557042d_06c7_4315_8e35_0884cd906ef9.slice/crio-a4100f3d8f251c05a91f911ff79bf6e469ac5f45e7166735f96b405d5cfba469 WatchSource:0}: Error finding container a4100f3d8f251c05a91f911ff79bf6e469ac5f45e7166735f96b405d5cfba469: Status 404 returned error can't find the container with id a4100f3d8f251c05a91f911ff79bf6e469ac5f45e7166735f96b405d5cfba469 Apr 22 18:42:17.089535 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.089510 2578 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f34c1a3_3713_45c0_b770_cec5862c620d.slice/crio-46aaa9e71c4e703f0b7a75f45b2d56e6f227251aa15e82cba4f4f69c8baf32d1 WatchSource:0}: Error finding container 46aaa9e71c4e703f0b7a75f45b2d56e6f227251aa15e82cba4f4f69c8baf32d1: Status 404 returned error can't find the container with id 46aaa9e71c4e703f0b7a75f45b2d56e6f227251aa15e82cba4f4f69c8baf32d1 Apr 22 18:42:17.090148 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:17.090126 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a0c68c2_fd6c_49b2_bf04_84096034153e.slice/crio-c74edbf147cc97046f7d8881397945f5a3f351f359405a8acfcffd7a2e6d29af WatchSource:0}: Error finding container c74edbf147cc97046f7d8881397945f5a3f351f359405a8acfcffd7a2e6d29af: Status 404 returned error can't find the container with id c74edbf147cc97046f7d8881397945f5a3f351f359405a8acfcffd7a2e6d29af Apr 22 18:42:17.243488 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.243456 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:37:15 +0000 UTC" deadline="2027-10-25 02:35:37.00879074 +0000 UTC" Apr 22 18:42:17.243488 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.243483 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13207h53m19.765310854s" Apr 22 18:42:17.303014 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.302982 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" event={"ID":"79daf747565244a11b1a61e38cc6d0df","Type":"ContainerStarted","Data":"987304071ae5ba7f1752380ca7658e69f12142a2cc019eec27699063baa843d1"} Apr 22 18:42:17.303973 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.303947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-7n89x" event={"ID":"3f34c1a3-3713-45c0-b770-cec5862c620d","Type":"ContainerStarted","Data":"46aaa9e71c4e703f0b7a75f45b2d56e6f227251aa15e82cba4f4f69c8baf32d1"} Apr 22 18:42:17.305993 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.305974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" event={"ID":"743a3f4b-8a0a-4113-a780-f4f12d4c5db5","Type":"ContainerStarted","Data":"6a631a26295f3a069597836ac803fb0c42507794db32d11305b0d4392649895f"} Apr 22 18:42:17.307032 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.307012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerStarted","Data":"29097e7e54921d02bdc047ebe56adaea57e74cc5c53fad37e3284b233abdf531"} Apr 22 18:42:17.308555 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.308535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4l828" event={"ID":"644b7849-60c1-43db-80c0-b41ef5e73b3f","Type":"ContainerStarted","Data":"e0332dcd08d9fad0be65a788ac54e3df9c316de5de3ca0529933296e60086ae8"} Apr 22 18:42:17.311124 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.311018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-67qqz" event={"ID":"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8","Type":"ContainerStarted","Data":"67395dc3458062c7f477f0089fc8b789a80d88265dfaeaeb3d90e58b882fe18e"} Apr 22 18:42:17.313061 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.313035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"c74edbf147cc97046f7d8881397945f5a3f351f359405a8acfcffd7a2e6d29af"} Apr 22 18:42:17.314404 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.314376 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x5q6c" event={"ID":"c557042d-06c7-4315-8e35-0884cd906ef9","Type":"ContainerStarted","Data":"a4100f3d8f251c05a91f911ff79bf6e469ac5f45e7166735f96b405d5cfba469"} Apr 22 18:42:17.316892 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.316874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4cgd8" event={"ID":"9d4cda14-9eb0-451a-95dd-098981e2a91b","Type":"ContainerStarted","Data":"1b411c6bfcef6988b376a6cb804d065f9901383c0cb02c89ad49ed4dc227d53a"} Apr 22 18:42:17.321761 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.321726 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-85.ec2.internal" podStartSLOduration=2.321714847 podStartE2EDuration="2.321714847s" podCreationTimestamp="2026-04-22 18:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:17.321585965 +0000 UTC m=+3.563926367" watchObservedRunningTime="2026-04-22 18:42:17.321714847 +0000 UTC m=+3.564055245" Apr 22 18:42:17.573533 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.573444 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tdb8c"] Apr 22 18:42:17.576165 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.576141 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.576311 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.576217 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:17.615822 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.615585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83ef0e88-f056-4378-862a-0cd9fcd367d1-kubelet-config\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.615822 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.615649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.615822 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.615755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83ef0e88-f056-4378-862a-0cd9fcd367d1-dbus\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.717059 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.716532 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83ef0e88-f056-4378-862a-0cd9fcd367d1-dbus\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.717059 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.716599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/83ef0e88-f056-4378-862a-0cd9fcd367d1-kubelet-config\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.717059 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.716622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.717059 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.716731 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/83ef0e88-f056-4378-862a-0cd9fcd367d1-dbus\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.717059 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.716771 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:17.717059 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.716791 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/83ef0e88-f056-4378-862a-0cd9fcd367d1-kubelet-config\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:17.717059 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.716829 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret podName:83ef0e88-f056-4378-862a-0cd9fcd367d1 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:42:18.216809997 +0000 UTC m=+4.459150375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret") pod "global-pull-secret-syncer-tdb8c" (UID: "83ef0e88-f056-4378-862a-0cd9fcd367d1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:17.817202 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.817164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:17.817384 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.817319 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:17.817455 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.817393 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:19.817373629 +0000 UTC m=+6.059714007 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:17.918249 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:17.918156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:17.918411 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.918359 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:17.918411 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.918378 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:17.918411 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.918392 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jdt8v for pod openshift-network-diagnostics/network-check-target-mbl6x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:17.918571 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:17.918449 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v podName:4790d6e1-aad7-43f0-95f6-04dde16468f8 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:42:19.918430649 +0000 UTC m=+6.160771031 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdt8v" (UniqueName: "kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v") pod "network-check-target-mbl6x" (UID: "4790d6e1-aad7-43f0-95f6-04dde16468f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:18.220212 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:18.220146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:18.220384 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:18.220351 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:18.220438 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:18.220417 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret podName:83ef0e88-f056-4378-862a-0cd9fcd367d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:19.220396979 +0000 UTC m=+5.462737370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret") pod "global-pull-secret-syncer-tdb8c" (UID: "83ef0e88-f056-4378-862a-0cd9fcd367d1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:18.295280 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:18.294793 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:18.295280 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:18.294934 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:18.295280 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:18.294990 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:18.295280 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:18.295102 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:18.323884 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:18.323846 2578 generic.go:358] "Generic (PLEG): container finished" podID="c535ba14e14c867f6ac9b6b8e0cc4308" containerID="a33eebaa3098c2c18cbd2074b907519c880ba28b882f151f403e923c4b349ff1" exitCode=0 Apr 22 18:42:18.324775 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:18.324748 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" event={"ID":"c535ba14e14c867f6ac9b6b8e0cc4308","Type":"ContainerDied","Data":"a33eebaa3098c2c18cbd2074b907519c880ba28b882f151f403e923c4b349ff1"} Apr 22 18:42:19.232120 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:19.232067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:19.232386 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.232366 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:19.232465 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.232435 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret podName:83ef0e88-f056-4378-862a-0cd9fcd367d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:21.232417153 +0000 UTC m=+7.474757530 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret") pod "global-pull-secret-syncer-tdb8c" (UID: "83ef0e88-f056-4378-862a-0cd9fcd367d1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:19.293528 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:19.293492 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:19.293710 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.293612 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:19.347857 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:19.347822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" event={"ID":"c535ba14e14c867f6ac9b6b8e0cc4308","Type":"ContainerStarted","Data":"aa07a0179351cc02b889e4126300645d76545b2adfb743852efd0d707c897522"} Apr 22 18:42:19.838186 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:19.837553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:19.838186 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.837773 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered 
Apr 22 18:42:19.838186 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.837835 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:23.837816648 +0000 UTC m=+10.080157037 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:19.938855 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:19.938820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:19.939017 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.938991 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:19.939017 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.939008 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:19.939131 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.939019 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jdt8v for pod openshift-network-diagnostics/network-check-target-mbl6x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:19.939131 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:19.939078 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v podName:4790d6e1-aad7-43f0-95f6-04dde16468f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:23.939059713 +0000 UTC m=+10.181400095 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdt8v" (UniqueName: "kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v") pod "network-check-target-mbl6x" (UID: "4790d6e1-aad7-43f0-95f6-04dde16468f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:20.297549 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:20.297009 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:20.297549 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:20.297091 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:20.297549 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:20.297406 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:20.297549 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:20.297497 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:21.250803 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:21.250766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:21.251244 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:21.250961 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:21.251244 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:21.251021 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret podName:83ef0e88-f056-4378-862a-0cd9fcd367d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:25.251002609 +0000 UTC m=+11.493342993 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret") pod "global-pull-secret-syncer-tdb8c" (UID: "83ef0e88-f056-4378-862a-0cd9fcd367d1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:21.293593 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:21.293558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:21.293769 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:21.293702 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:22.293773 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:22.293728 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:22.294206 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:22.293867 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:22.296823 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:22.296608 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:22.296823 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:22.296756 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:23.293911 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:23.293871 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:23.294330 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:23.294009 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:23.874009 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:23.873974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:23.874215 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:23.874129 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:23.874215 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:23.874184 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:31.874171274 +0000 UTC m=+18.116511650 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:23.974987 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:23.974433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:23.974987 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:23.974598 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:23.974987 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:23.974614 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:23.974987 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:23.974627 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jdt8v for pod openshift-network-diagnostics/network-check-target-mbl6x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:23.974987 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:23.974697 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v podName:4790d6e1-aad7-43f0-95f6-04dde16468f8 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:42:31.974683272 +0000 UTC m=+18.217023649 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdt8v" (UniqueName: "kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v") pod "network-check-target-mbl6x" (UID: "4790d6e1-aad7-43f0-95f6-04dde16468f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:24.293623 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:24.293589 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:24.293815 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:24.293749 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:24.294103 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:24.294074 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:24.294457 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:24.294165 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:25.287273 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:25.286697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:25.287273 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:25.286878 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:25.287273 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:25.286938 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret podName:83ef0e88-f056-4378-862a-0cd9fcd367d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:33.286920847 +0000 UTC m=+19.529261238 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret") pod "global-pull-secret-syncer-tdb8c" (UID: "83ef0e88-f056-4378-862a-0cd9fcd367d1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:25.293920 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:25.293890 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:25.294059 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:25.294019 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:26.297878 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:26.297788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:26.298267 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:26.297788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:26.298267 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:26.297925 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:26.298267 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:26.298007 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:27.294012 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:27.293980 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:27.294190 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:27.294101 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:28.296657 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:28.295989 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:28.296657 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:28.296110 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:28.296657 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:28.296511 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:28.296657 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:28.296588 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:29.294414 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.294377 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:29.294584 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:29.294497 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:29.533115 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.533067 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-85.ec2.internal" podStartSLOduration=14.533052414 podStartE2EDuration="14.533052414s" podCreationTimestamp="2026-04-22 18:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:19.364079472 +0000 UTC m=+5.606419872" watchObservedRunningTime="2026-04-22 18:42:29.533052414 +0000 UTC m=+15.775392813" Apr 22 18:42:29.533523 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.533407 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-b5q9v"] Apr 22 18:42:29.537297 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.537259 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.541058 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.541033 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:42:29.541172 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.541059 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:42:29.541457 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.541438 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hztd2\"" Apr 22 18:42:29.622222 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.622138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjwzc\" (UniqueName: \"kubernetes.io/projected/b4a78812-3844-4d64-b8b5-016856db881b-kube-api-access-jjwzc\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.622222 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.622181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4a78812-3844-4d64-b8b5-016856db881b-hosts-file\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.622222 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.622224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4a78812-3844-4d64-b8b5-016856db881b-tmp-dir\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.723255 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.723218 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjwzc\" (UniqueName: \"kubernetes.io/projected/b4a78812-3844-4d64-b8b5-016856db881b-kube-api-access-jjwzc\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.723255 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.723254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4a78812-3844-4d64-b8b5-016856db881b-hosts-file\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.723582 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.723284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4a78812-3844-4d64-b8b5-016856db881b-tmp-dir\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.723724 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.723700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4a78812-3844-4d64-b8b5-016856db881b-hosts-file\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.723787 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.723717 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b4a78812-3844-4d64-b8b5-016856db881b-tmp-dir\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.736176 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.736148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jjwzc\" (UniqueName: \"kubernetes.io/projected/b4a78812-3844-4d64-b8b5-016856db881b-kube-api-access-jjwzc\") pod \"node-resolver-b5q9v\" (UID: \"b4a78812-3844-4d64-b8b5-016856db881b\") " pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:29.846184 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:29.846154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b5q9v" Apr 22 18:42:30.296553 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:30.296520 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:30.296553 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:30.296551 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:30.296785 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:30.296676 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:30.296785 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:30.296774 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:31.294026 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:31.293990 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:31.294499 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:31.294090 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:31.938689 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:31.938612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:31.938870 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:31.938774 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:31.938870 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:31.938838 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:47.93882264 +0000 UTC m=+34.181163017 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:32.039385 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:32.039352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:32.039574 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:32.039541 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:32.039574 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:32.039567 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:32.039716 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:32.039579 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jdt8v for pod openshift-network-diagnostics/network-check-target-mbl6x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:32.039716 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:32.039660 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v podName:4790d6e1-aad7-43f0-95f6-04dde16468f8 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:42:48.039628258 +0000 UTC m=+34.281968653 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdt8v" (UniqueName: "kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v") pod "network-check-target-mbl6x" (UID: "4790d6e1-aad7-43f0-95f6-04dde16468f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:32.294085 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:32.294011 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:32.294085 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:32.294044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:32.294546 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:32.294134 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:32.294546 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:32.294282 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:33.294264 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:33.294231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:33.294722 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:33.294340 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:33.347681 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:33.347623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:33.347850 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:33.347796 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:33.347905 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:33.347863 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret podName:83ef0e88-f056-4378-862a-0cd9fcd367d1 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:49.347848341 +0000 UTC m=+35.590188719 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret") pod "global-pull-secret-syncer-tdb8c" (UID: "83ef0e88-f056-4378-862a-0cd9fcd367d1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:34.294970 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:34.294945 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:34.295296 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:34.295044 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:34.295296 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:34.295088 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:34.295296 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:34.295207 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:34.380770 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:34.380716 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b5q9v" event={"ID":"b4a78812-3844-4d64-b8b5-016856db881b","Type":"ContainerStarted","Data":"3cc82bd0f315ffe089b247df1aea7753bf5294132a4a8df1e1ae3d46c48290c6"} Apr 22 18:42:35.294162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.293763 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:35.294304 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:35.294281 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:35.383847 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.383804 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68" containerID="5579e1c090bf5fd29bcced22374f77298eb6b1c9022616b8b68ef9d140f25eda" exitCode=0 Apr 22 18:42:35.384536 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.383904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerDied","Data":"5579e1c090bf5fd29bcced22374f77298eb6b1c9022616b8b68ef9d140f25eda"} Apr 22 18:42:35.385282 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.385256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4l828" event={"ID":"644b7849-60c1-43db-80c0-b41ef5e73b3f","Type":"ContainerStarted","Data":"1dd5154c6f07937303547c908d80306707b0b80953cd221d4aaade12da375d46"} Apr 22 18:42:35.386717 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.386687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-67qqz" event={"ID":"6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8","Type":"ContainerStarted","Data":"fc27610ea03c663024eed32455ba905a5290b66b651687c94b8c4646f97e4593"} Apr 22 18:42:35.388116 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.388093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b5q9v" event={"ID":"b4a78812-3844-4d64-b8b5-016856db881b","Type":"ContainerStarted","Data":"4e7bd6e7d4cd5a7a4213f66571bb1b2b7e4d671fb4701fdffbd7cce8225386f8"} Apr 22 18:42:35.393709 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.393632 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:42:35.394013 
ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.393970 2578 generic.go:358] "Generic (PLEG): container finished" podID="2a0c68c2-fd6c-49b2-bf04-84096034153e" containerID="6988c6f500e0e1c0de1296defb19781273bb68495434e4f2cac6f245045d4711" exitCode=1 Apr 22 18:42:35.394095 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.394034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"163b4f4e66df96ecaa3c9848b76ad5c844970b7caba8453802c73e688984a6dd"} Apr 22 18:42:35.394095 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.394055 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"cb2137f1fa83816127ac9f88a3d9173965bb89ae00d1334f3bb837709f5eda27"} Apr 22 18:42:35.394095 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.394068 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"559d60f0d2f37c71f8b140761f3ee2666972e6ff2222d9ce4d390c861f6189aa"} Apr 22 18:42:35.394095 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.394076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"d50d2433036d45d8d99fe2dee8a990dfb14e6338f733f33a809a2708df9bd42e"} Apr 22 18:42:35.394095 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.394083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerDied","Data":"6988c6f500e0e1c0de1296defb19781273bb68495434e4f2cac6f245045d4711"} Apr 22 18:42:35.394095 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 18:42:35.394096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"0eaf497b9ba555afc1e0aef3c397f8bb1f97da9072bca3894275d9f538d16d7a"} Apr 22 18:42:35.395403 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.395380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x5q6c" event={"ID":"c557042d-06c7-4315-8e35-0884cd906ef9","Type":"ContainerStarted","Data":"ac282dfbab3d0354e32ca2099854770873031b5e2fe62b20c956ff61ee979582"} Apr 22 18:42:35.396774 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.396745 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7n89x" event={"ID":"3f34c1a3-3713-45c0-b770-cec5862c620d","Type":"ContainerStarted","Data":"facb45044a93e6501a3e1e96d3224310e9954936c079069c6ada6493157362b8"} Apr 22 18:42:35.398071 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.398053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" event={"ID":"743a3f4b-8a0a-4113-a780-f4f12d4c5db5","Type":"ContainerStarted","Data":"1a5c1adf5ad31d826d42dc207befb7fbe27ee47a0d62f445289985e863681132"} Apr 22 18:42:35.423518 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.423469 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4l828" podStartSLOduration=4.244778668 podStartE2EDuration="21.423454537s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.064986167 +0000 UTC m=+3.307326543" lastFinishedPulling="2026-04-22 18:42:34.243662031 +0000 UTC m=+20.486002412" observedRunningTime="2026-04-22 18:42:35.423409982 +0000 UTC m=+21.665750381" watchObservedRunningTime="2026-04-22 18:42:35.423454537 +0000 UTC m=+21.665794937" Apr 22 18:42:35.454883 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 18:42:35.454827 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-67qqz" podStartSLOduration=4.271867199 podStartE2EDuration="21.454812443s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.060763194 +0000 UTC m=+3.303103571" lastFinishedPulling="2026-04-22 18:42:34.243708432 +0000 UTC m=+20.486048815" observedRunningTime="2026-04-22 18:42:35.454776018 +0000 UTC m=+21.697116421" watchObservedRunningTime="2026-04-22 18:42:35.454812443 +0000 UTC m=+21.697152843" Apr 22 18:42:35.455041 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.455018 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x5q6c" podStartSLOduration=4.258530834 podStartE2EDuration="21.455011425s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.088346495 +0000 UTC m=+3.330686871" lastFinishedPulling="2026-04-22 18:42:34.284827071 +0000 UTC m=+20.527167462" observedRunningTime="2026-04-22 18:42:35.44018996 +0000 UTC m=+21.682530360" watchObservedRunningTime="2026-04-22 18:42:35.455011425 +0000 UTC m=+21.697351823" Apr 22 18:42:35.472680 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.472613 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7n89x" podStartSLOduration=9.086096007 podStartE2EDuration="21.472602349s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.092571779 +0000 UTC m=+3.334912159" lastFinishedPulling="2026-04-22 18:42:29.479078109 +0000 UTC m=+15.721418501" observedRunningTime="2026-04-22 18:42:35.472302304 +0000 UTC m=+21.714642705" watchObservedRunningTime="2026-04-22 18:42:35.472602349 +0000 UTC m=+21.714942748" Apr 22 18:42:35.493687 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.493625 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-b5q9v" podStartSLOduration=6.493612979 podStartE2EDuration="6.493612979s" podCreationTimestamp="2026-04-22 18:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:35.493126962 +0000 UTC m=+21.735467361" watchObservedRunningTime="2026-04-22 18:42:35.493612979 +0000 UTC m=+21.735953378" Apr 22 18:42:35.985363 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:35.985302 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:42:36.270678 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.270523 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:42:35.985357677Z","UUID":"5a79cec9-f07e-4192-ba4e-017644dd870f","Handler":null,"Name":"","Endpoint":""} Apr 22 18:42:36.273012 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.272974 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:42:36.273129 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.273020 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:42:36.293603 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.293575 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:36.293735 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:36.293709 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:36.293806 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.293781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:36.293944 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:36.293905 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:36.401945 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.401857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" event={"ID":"743a3f4b-8a0a-4113-a780-f4f12d4c5db5","Type":"ContainerStarted","Data":"dd9f537c0649b111e769a7c8690c3050d67910ae53fa3b49bfa9bacd6443378a"} Apr 22 18:42:36.403305 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.403249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4cgd8" event={"ID":"9d4cda14-9eb0-451a-95dd-098981e2a91b","Type":"ContainerStarted","Data":"e19bf01c8bf2eb163376dc352384ca61c554c90ca5ed93e96a87bdcaaba16b56"} Apr 22 18:42:36.432516 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:36.432471 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4cgd8" podStartSLOduration=5.256792181 podStartE2EDuration="22.432457185s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.068012 +0000 UTC m=+3.310352378" lastFinishedPulling="2026-04-22 18:42:34.243677001 +0000 UTC m=+20.486017382" observedRunningTime="2026-04-22 18:42:36.431907772 +0000 UTC m=+22.674248172" watchObservedRunningTime="2026-04-22 18:42:36.432457185 +0000 UTC m=+22.674797621" Apr 22 18:42:37.293838 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:37.293750 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:37.294001 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:37.293889 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:38.294251 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:38.294076 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:38.294825 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:38.294138 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:38.294825 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:38.294358 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:38.294825 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:38.294390 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:38.409192 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:38.409151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" event={"ID":"743a3f4b-8a0a-4113-a780-f4f12d4c5db5","Type":"ContainerStarted","Data":"a9f9d1f693d532a6dac1c3f0f1ebcb61b16122037c3b1af6c94ec7483b73141d"} Apr 22 18:42:38.412149 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:38.412126 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:42:38.412510 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:38.412459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"28cc765c64a01a5d2a47348d07447a909c8ddbefbbad87d94fb74ffa18a8bfea"} Apr 22 18:42:38.427450 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:38.427404 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8kkh5" podStartSLOduration=3.7337140140000002 podStartE2EDuration="24.427392916s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.06858349 +0000 UTC m=+3.310923867" lastFinishedPulling="2026-04-22 18:42:37.762262388 +0000 UTC m=+24.004602769" observedRunningTime="2026-04-22 18:42:38.426977163 +0000 UTC m=+24.669317584" watchObservedRunningTime="2026-04-22 18:42:38.427392916 +0000 UTC m=+24.669733315" Apr 22 18:42:39.294379 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:39.294349 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:39.294859 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:39.294461 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:39.429041 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:39.429011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:39.429665 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:39.429625 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:40.293764 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.293581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:40.293935 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.293581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:40.293935 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:40.293905 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:40.293935 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:40.293824 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:40.417987 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.417960 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68" containerID="be8bcfbe2b66ec15455cb2063cd28d05d8c0538b31dd4e0c2315d483db8a21a8" exitCode=0 Apr 22 18:42:40.418671 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.418036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerDied","Data":"be8bcfbe2b66ec15455cb2063cd28d05d8c0538b31dd4e0c2315d483db8a21a8"} Apr 22 18:42:40.421227 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.421209 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:42:40.421569 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.421541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"d775476875cd8d9b166e6f196a8196b24f55459addb0906d055032119f9b8f84"} Apr 22 18:42:40.421788 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.421768 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7n89x" Apr 
22 18:42:40.421832 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.421801 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:40.422029 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.422006 2578 scope.go:117] "RemoveContainer" containerID="6988c6f500e0e1c0de1296defb19781273bb68495434e4f2cac6f245045d4711" Apr 22 18:42:40.422240 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.422228 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7n89x" Apr 22 18:42:40.437709 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.437692 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:40.911688 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:40.911658 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:41.294198 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.294158 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:41.294379 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:41.294314 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:41.425867 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.425835 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68" containerID="c1413c5fa49f102e0e1f69434775c7cde951d7c9c6a0fa6a0e936661f2f97ea9" exitCode=0 Apr 22 18:42:41.426286 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.425903 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerDied","Data":"c1413c5fa49f102e0e1f69434775c7cde951d7c9c6a0fa6a0e936661f2f97ea9"} Apr 22 18:42:41.430237 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.430211 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:42:41.430691 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.430667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" event={"ID":"2a0c68c2-fd6c-49b2-bf04-84096034153e","Type":"ContainerStarted","Data":"2664bfdfd26b62ecff956bf57672f551505859fbcc04975995afcaaa2e1c5be8"} Apr 22 18:42:41.431037 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.431011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:41.447369 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.447335 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:42:41.485803 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.485705 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" podStartSLOduration=10.24382896 podStartE2EDuration="27.485693082s" 
podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.092614377 +0000 UTC m=+3.334954756" lastFinishedPulling="2026-04-22 18:42:34.334478501 +0000 UTC m=+20.576818878" observedRunningTime="2026-04-22 18:42:41.485342852 +0000 UTC m=+27.727683252" watchObservedRunningTime="2026-04-22 18:42:41.485693082 +0000 UTC m=+27.728033480" Apr 22 18:42:41.653504 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.653475 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mbl6x"] Apr 22 18:42:41.653783 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.653591 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:41.653783 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:41.653681 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:41.657019 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.656993 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tdb8c"] Apr 22 18:42:41.657145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.657090 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:41.657206 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:41.657186 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:41.657592 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.657571 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l6jrz"] Apr 22 18:42:41.657699 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:41.657689 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:41.657801 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:41.657786 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:42.434775 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:42.434743 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68" containerID="3f325b9aee3d31cbd8a1d5ca0be4db77d40f6202977eabfcc48382d8cd3e3996" exitCode=0 Apr 22 18:42:42.435142 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:42.434825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerDied","Data":"3f325b9aee3d31cbd8a1d5ca0be4db77d40f6202977eabfcc48382d8cd3e3996"} Apr 22 18:42:43.293740 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:43.293708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:43.293906 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:43.293708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:43.293906 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:43.293834 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:43.294015 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:43.293938 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:43.294015 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:43.293708 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:43.294111 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:43.294036 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:45.294016 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:45.293836 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:45.294464 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:45.293866 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:45.294464 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:45.294118 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:45.294464 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:45.293902 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:45.294464 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:45.294197 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:45.294464 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:45.294245 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:47.293741 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.293657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:42:47.293741 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.293714 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:42:47.294274 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.293810 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mbl6x" podUID="4790d6e1-aad7-43f0-95f6-04dde16468f8" Apr 22 18:42:47.294274 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.293867 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c" Apr 22 18:42:47.294274 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.293891 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048" Apr 22 18:42:47.294274 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.293939 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tdb8c" podUID="83ef0e88-f056-4378-862a-0cd9fcd367d1" Apr 22 18:42:47.514411 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.514368 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-85.ec2.internal" event="NodeReady" Apr 22 18:42:47.514586 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.514523 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:42:47.560584 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.560501 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c877b6db6-s25mx"] Apr 22 18:42:47.599070 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.599032 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"] Apr 22 18:42:47.599215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.599189 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.603986 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.603961 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:42:47.604342 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.604312 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:42:47.604473 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.604385 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5lrh7\"" Apr 22 18:42:47.610674 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.610614 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:42:47.610894 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:42:47.610874 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:42:47.615394 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.614654 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6"] Apr 22 18:42:47.615394 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.614757 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:42:47.617686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.617398 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fch7b\"" Apr 22 18:42:47.617686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.617442 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:42:47.617686 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.617415 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:42:47.636002 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.635973 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l"] Apr 22 18:42:47.636161 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.636133 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.638963 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.638937 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-trrtp\"" Apr 22 18:42:47.639162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.639143 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:42:47.639396 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.639381 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 18:42:47.639624 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.639604 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:42:47.640246 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.640229 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:42:47.663983 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.663958 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d"] Apr 22 18:42:47.664137 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.664114 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.666894 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.666871 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 18:42:47.683099 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.683073 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gtx4b"] Apr 22 18:42:47.683226 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.683212 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.685732 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.685712 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 18:42:47.685841 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.685712 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 18:42:47.685841 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.685771 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 18:42:47.685965 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.685877 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 18:42:47.710341 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.710298 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c877b6db6-s25mx"] Apr 22 18:42:47.710461 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 18:42:47.710350 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"] Apr 22 18:42:47.710461 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.710363 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d"] Apr 22 18:42:47.710461 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.710373 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l"] Apr 22 18:42:47.710461 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.710383 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6"] Apr 22 18:42:47.710461 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.710396 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nmmxl"] Apr 22 18:42:47.710461 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.710455 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:47.713540 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.713523 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:42:47.713672 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.713603 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5kx7g\"" Apr 22 18:42:47.714013 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.713997 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:42:47.722972 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.722954 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gtx4b"] Apr 22 18:42:47.723057 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.722975 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nmmxl"] Apr 22 18:42:47.723057 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.723045 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nmmxl" Apr 22 18:42:47.725857 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.725836 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:42:47.725965 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.725903 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrbrj\"" Apr 22 18:42:47.726023 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.725973 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:42:47.726071 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.726041 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:42:47.759623 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/505044c7-242a-4290-ab95-7e778348f684-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:42:47.759747 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2f6e9159-5066-4e91-b3cb-ccc1d827eece-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.759747 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759730 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-trusted-ca\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.759829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-installation-pull-secrets\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.759829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-bound-sa-token\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.759914 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.759914 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxws\" (UniqueName: 
\"kubernetes.io/projected/2f6e9159-5066-4e91-b3cb-ccc1d827eece-kube-api-access-vsxws\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.760013 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759947 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-registry-certificates\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.760013 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.759994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssnch\" (UniqueName: \"kubernetes.io/projected/585763e4-02be-466c-b994-4f49efab07bd-kube-api-access-ssnch\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.760090 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:42:47.760090 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760046 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/585763e4-02be-466c-b994-4f49efab07bd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.760168 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760090 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj8zb\" (UniqueName: \"kubernetes.io/projected/c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5-kube-api-access-xj8zb\") pod \"managed-serviceaccount-addon-agent-968b4dfc-2nkj6\" (UID: \"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.760168 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/585763e4-02be-466c-b994-4f49efab07bd-tmp\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.760168 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.760314 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5-hub-kubeconfig\") 
pod \"managed-serviceaccount-addon-agent-968b4dfc-2nkj6\" (UID: \"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.760314 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760197 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-ca\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.760314 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgmx9\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-kube-api-access-kgmx9\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.760314 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-hub\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.760460 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-image-registry-private-configuration\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " 
pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.760460 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760356 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.760460 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.760380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5189531f-23d4-4c5f-86e8-91035f932038-ca-trust-extracted\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861025 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.860952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.861025 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.860989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2dk5\" (UniqueName: \"kubernetes.io/projected/cd714644-718d-4a14-9b70-5b3aa5980856-kube-api-access-h2dk5\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl" Apr 22 18:42:47.861025 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861010 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-ca\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-image-registry-private-configuration\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.861166 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861194 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5189531f-23d4-4c5f-86e8-91035f932038-ca-trust-extracted\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssnch\" (UniqueName: \"kubernetes.io/projected/585763e4-02be-466c-b994-4f49efab07bd-kube-api-access-ssnch\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.861292 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.861278 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:48.361254215 +0000 UTC m=+34.603594592 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-bound-sa-token\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxws\" (UniqueName: \"kubernetes.io/projected/2f6e9159-5066-4e91-b3cb-ccc1d827eece-kube-api-access-vsxws\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-installation-pull-secrets\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-registry-certificates\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbe11d10-5c49-495a-8459-b9d0af0389ae-config-volume\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.861347 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj8zb\" (UniqueName: \"kubernetes.io/projected/c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5-kube-api-access-xj8zb\") pod 
\"managed-serviceaccount-addon-agent-968b4dfc-2nkj6\" (UID: \"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.861539 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/585763e4-02be-466c-b994-4f49efab07bd-tmp\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.861616 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:48.361591285 +0000 UTC m=+34.603931679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5189531f-23d4-4c5f-86e8-91035f932038-ca-trust-extracted\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.861713 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-968b4dfc-2nkj6\" (UID: \"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgmx9\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-kube-api-access-kgmx9\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-hub\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2f6e9159-5066-4e91-b3cb-ccc1d827eece-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-trusted-ca\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/585763e4-02be-466c-b994-4f49efab07bd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/505044c7-242a-4290-ab95-7e778348f684-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861905 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/585763e4-02be-466c-b994-4f49efab07bd-tmp\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cbe11d10-5c49-495a-8459-b9d0af0389ae-tmp-dir\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:47.862384 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.861969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjcz\" (UniqueName: \"kubernetes.io/projected/cbe11d10-5c49-495a-8459-b9d0af0389ae-kube-api-access-vnjcz\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:47.863472 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.863445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-trusted-ca\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.864035 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.863724 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-registry-certificates\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.864197 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:42:47.864172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2f6e9159-5066-4e91-b3cb-ccc1d827eece-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.864293 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.864266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/505044c7-242a-4290-ab95-7e778348f684-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:42:47.866225 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.866205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-hub\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.866808 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.866783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.866966 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.866924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-968b4dfc-2nkj6\" (UID: \"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.867108 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.867089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-installation-pull-secrets\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.867349 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.867323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-image-registry-private-configuration\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.868484 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.868441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/585763e4-02be-466c-b994-4f49efab07bd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.871358 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.871339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-bound-sa-token\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.871564 
ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.871543 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgmx9\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-kube-api-access-kgmx9\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:42:47.871944 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.871927 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj8zb\" (UniqueName: \"kubernetes.io/projected/c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5-kube-api-access-xj8zb\") pod \"managed-serviceaccount-addon-agent-968b4dfc-2nkj6\" (UID: \"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.873030 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.872987 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssnch\" (UniqueName: \"kubernetes.io/projected/585763e4-02be-466c-b994-4f49efab07bd-kube-api-access-ssnch\") pod \"klusterlet-addon-workmgr-6897d948bb-skr9l\" (UID: \"585763e4-02be-466c-b994-4f49efab07bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:47.876400 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.876087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-ca\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.876502 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.876452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/2f6e9159-5066-4e91-b3cb-ccc1d827eece-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.878450 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.878408 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxws\" (UniqueName: \"kubernetes.io/projected/2f6e9159-5066-4e91-b3cb-ccc1d827eece-kube-api-access-vsxws\") pod \"cluster-proxy-proxy-agent-5b656788d6-csc4d\" (UID: \"2f6e9159-5066-4e91-b3cb-ccc1d827eece\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:42:47.962027 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.961997 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" Apr 22 18:42:47.962233 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cbe11d10-5c49-495a-8459-b9d0af0389ae-tmp-dir\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:47.962274 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjcz\" (UniqueName: \"kubernetes.io/projected/cbe11d10-5c49-495a-8459-b9d0af0389ae-kube-api-access-vnjcz\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:47.962274 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz"
Apr 22 18:42:47.962347 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2dk5\" (UniqueName: \"kubernetes.io/projected/cd714644-718d-4a14-9b70-5b3aa5980856-kube-api-access-h2dk5\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:42:47.962424 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.962395 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:42:47.962546 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.962474 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:19.962456671 +0000 UTC m=+66.204797069 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:42:47.962546 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cbe11d10-5c49-495a-8459-b9d0af0389ae-tmp-dir\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:47.962694 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:47.962889 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.962705 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:42:47.962889 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbe11d10-5c49-495a-8459-b9d0af0389ae-config-volume\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:47.962889 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.962744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:42:47.962889 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.962755 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:42:48.462742557 +0000 UTC m=+34.705082936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found
Apr 22 18:42:47.962889 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.962791 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:42:47.962889 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:47.962828 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:48.46281796 +0000 UTC m=+34.705158338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found
Apr 22 18:42:47.963201 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.963126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbe11d10-5c49-495a-8459-b9d0af0389ae-config-volume\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:47.971445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.971408 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2dk5\" (UniqueName: \"kubernetes.io/projected/cd714644-718d-4a14-9b70-5b3aa5980856-kube-api-access-h2dk5\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:42:47.971676 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.971532 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjcz\" (UniqueName: \"kubernetes.io/projected/cbe11d10-5c49-495a-8459-b9d0af0389ae-kube-api-access-vnjcz\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:47.974320 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.974299 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l"
Apr 22 18:42:47.992890 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:47.992809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d"
Apr 22 18:42:48.063539 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.063416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x"
Apr 22 18:42:48.063711 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.063665 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:42:48.063711 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.063686 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:42:48.063711 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.063698 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jdt8v for pod openshift-network-diagnostics/network-check-target-mbl6x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:42:48.063873 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.063756 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v podName:4790d6e1-aad7-43f0-95f6-04dde16468f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:20.063736995 +0000 UTC m=+66.306077374 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jdt8v" (UniqueName: "kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v") pod "network-check-target-mbl6x" (UID: "4790d6e1-aad7-43f0-95f6-04dde16468f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:42:48.123776 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.123362 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6"]
Apr 22 18:42:48.136306 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.136280 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l"]
Apr 22 18:42:48.151243 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.151211 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d"]
Apr 22 18:42:48.195906 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:48.195748 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9b66ee2_a75d_4c3b_8c06_d38c27cec0e5.slice/crio-f6e54ffe6e8fa074b15fb89694d5b052087b0e1c8363bf65f1e9042452855e17 WatchSource:0}: Error finding container f6e54ffe6e8fa074b15fb89694d5b052087b0e1c8363bf65f1e9042452855e17: Status 404 returned error can't find the container with id f6e54ffe6e8fa074b15fb89694d5b052087b0e1c8363bf65f1e9042452855e17
Apr 22 18:42:48.196268 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:48.196243 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585763e4_02be_466c_b994_4f49efab07bd.slice/crio-d10137231c6e98a189c9eb7ca17ce09c145020c6750a00cd52e222c6d70c38aa WatchSource:0}: Error finding container d10137231c6e98a189c9eb7ca17ce09c145020c6750a00cd52e222c6d70c38aa: Status 404 returned error can't find the container with id d10137231c6e98a189c9eb7ca17ce09c145020c6750a00cd52e222c6d70c38aa
Apr 22 18:42:48.196983 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:48.196962 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f6e9159_5066_4e91_b3cb_ccc1d827eece.slice/crio-aaca6425c5596bd726215b4b0cff8e2d363039fe4053140de77d188d49c02c73 WatchSource:0}: Error finding container aaca6425c5596bd726215b4b0cff8e2d363039fe4053140de77d188d49c02c73: Status 404 returned error can't find the container with id aaca6425c5596bd726215b4b0cff8e2d363039fe4053140de77d188d49c02c73
Apr 22 18:42:48.365906 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.365885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:42:48.366382 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.365917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx"
Apr 22 18:42:48.366382 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.366036 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:42:48.366382 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.366086 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:49.366072611 +0000 UTC m=+35.608412988 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found
Apr 22 18:42:48.366382 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.366089 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:42:48.366382 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.366106 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found
Apr 22 18:42:48.366382 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.366151 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:49.366139393 +0000 UTC m=+35.608479771 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found
Apr 22 18:42:48.449096 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.449043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" event={"ID":"2f6e9159-5066-4e91-b3cb-ccc1d827eece","Type":"ContainerStarted","Data":"aaca6425c5596bd726215b4b0cff8e2d363039fe4053140de77d188d49c02c73"}
Apr 22 18:42:48.450089 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.450062 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" event={"ID":"585763e4-02be-466c-b994-4f49efab07bd","Type":"ContainerStarted","Data":"d10137231c6e98a189c9eb7ca17ce09c145020c6750a00cd52e222c6d70c38aa"}
Apr 22 18:42:48.451168 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.451141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" event={"ID":"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5","Type":"ContainerStarted","Data":"f6e54ffe6e8fa074b15fb89694d5b052087b0e1c8363bf65f1e9042452855e17"}
Apr 22 18:42:48.457887 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.457864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerStarted","Data":"dc9cb4aa860509807d7d34a53ba04320f9cad9afe1fa1eec16095901677df1bb"}
Apr 22 18:42:48.467063 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.467041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:42:48.467168 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:48.467105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:48.467217 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.467176 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:42:48.467217 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.467175 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:42:48.467290 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.467222 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:42:49.467209039 +0000 UTC m=+35.709549415 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found
Apr 22 18:42:48.467290 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:48.467234 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:49.467228508 +0000 UTC m=+35.709568885 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found
Apr 22 18:42:49.293858 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.293756 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x"
Apr 22 18:42:49.293858 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.293780 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c"
Apr 22 18:42:49.294192 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.294065 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz"
Apr 22 18:42:49.297249 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.296949 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:42:49.298788 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.298367 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7gnxr\""
Apr 22 18:42:49.298788 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.298454 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:42:49.298788 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.298367 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 18:42:49.298788 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.298620 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:42:49.298788 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.298702 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xrqzc\""
Apr 22 18:42:49.374720 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.374689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c"
Apr 22 18:42:49.375161 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.374782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:42:49.375161 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.374814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx"
Apr 22 18:42:49.375161 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.374973 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:42:49.375161 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.374988 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found
Apr 22 18:42:49.375161 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.375048 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:51.375028209 +0000 UTC m=+37.617368586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found
Apr 22 18:42:49.375538 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.375431 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:42:49.375538 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.375496 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:51.375478118 +0000 UTC m=+37.617818512 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found
Apr 22 18:42:49.382023 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.381974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/83ef0e88-f056-4378-862a-0cd9fcd367d1-original-pull-secret\") pod \"global-pull-secret-syncer-tdb8c\" (UID: \"83ef0e88-f056-4378-862a-0cd9fcd367d1\") " pod="kube-system/global-pull-secret-syncer-tdb8c"
Apr 22 18:42:49.471154 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.471116 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68" containerID="dc9cb4aa860509807d7d34a53ba04320f9cad9afe1fa1eec16095901677df1bb" exitCode=0
Apr 22 18:42:49.471321 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.471172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerDied","Data":"dc9cb4aa860509807d7d34a53ba04320f9cad9afe1fa1eec16095901677df1bb"}
Apr 22 18:42:49.475390 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.475366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:42:49.475510 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.475493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:49.475726 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.475709 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:42:49.475802 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.475772 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:42:51.475752388 +0000 UTC m=+37.718092803 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found
Apr 22 18:42:49.476271 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.476254 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:42:49.476356 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:49.476303 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:51.476288489 +0000 UTC m=+37.718628874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found
Apr 22 18:42:49.628528 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.628478 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tdb8c"
Apr 22 18:42:49.842248 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:49.841225 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tdb8c"]
Apr 22 18:42:49.848213 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:42:49.848184 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ef0e88_f056_4378_862a_0cd9fcd367d1.slice/crio-a3c1031e75d7d5b11e80a40ea208685882d26a583444cdc3b544ba54cdee567e WatchSource:0}: Error finding container a3c1031e75d7d5b11e80a40ea208685882d26a583444cdc3b544ba54cdee567e: Status 404 returned error can't find the container with id a3c1031e75d7d5b11e80a40ea208685882d26a583444cdc3b544ba54cdee567e
Apr 22 18:42:50.476576 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:50.476537 2578 generic.go:358] "Generic (PLEG): container finished" podID="ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68" containerID="d81e2d606d8bbab36432cd8fad68223954df085c19cdee56ef92bc70f1265b30" exitCode=0
Apr 22 18:42:50.477086 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:50.476664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerDied","Data":"d81e2d606d8bbab36432cd8fad68223954df085c19cdee56ef92bc70f1265b30"}
Apr 22 18:42:50.478074 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:50.477881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tdb8c" event={"ID":"83ef0e88-f056-4378-862a-0cd9fcd367d1","Type":"ContainerStarted","Data":"a3c1031e75d7d5b11e80a40ea208685882d26a583444cdc3b544ba54cdee567e"}
Apr 22 18:42:51.394633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:51.394595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:42:51.394829 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:51.394659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx"
Apr 22 18:42:51.394829 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.394763 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:42:51.394829 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.394820 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:42:51.394974 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.394835 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found
Apr 22 18:42:51.394974 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.394843 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:55.394820994 +0000 UTC m=+41.637161385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found
Apr 22 18:42:51.394974 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.394882 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:55.394866242 +0000 UTC m=+41.637206638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found
Apr 22 18:42:51.495699 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:51.495592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:42:51.496099 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:51.495722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:42:51.496099 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.495725 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:42:51.496099 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.495803 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:42:55.495780184 +0000 UTC m=+41.738120563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found
Apr 22 18:42:51.496099 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.495875 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:42:51.496099 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:51.495934 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:55.495916362 +0000 UTC m=+41.738256745 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found
Apr 22 18:42:55.427408 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:55.427211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:42:55.427979 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:55.427421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx"
Apr 22 18:42:55.427979 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.427374 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:42:55.427979 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.427529 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:42:55.427979 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.427545 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found
Apr 22 18:42:55.427979 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.427577 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:03.427548119 +0000 UTC m=+49.669888506 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found
Apr 22 18:42:55.427979 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.427602 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:03.427586109 +0000 UTC m=+49.669926490 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found
Apr 22 18:42:55.495029 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:55.494992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" event={"ID":"ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68","Type":"ContainerStarted","Data":"0b86e61b90b01bec84d90b1ccb8f2471c28e4880f22671e50792a057aa405b36"}
Apr 22 18:42:55.525157 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:55.525108 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c4hqs" podStartSLOduration=10.350732687 podStartE2EDuration="41.52509274s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:42:17.063517618
+0000 UTC m=+3.305857999" lastFinishedPulling="2026-04-22 18:42:48.237877672 +0000 UTC m=+34.480218052" observedRunningTime="2026-04-22 18:42:55.524178557 +0000 UTC m=+41.766519006" watchObservedRunningTime="2026-04-22 18:42:55.52509274 +0000 UTC m=+41.767433138" Apr 22 18:42:55.528399 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:55.528373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:42:55.528534 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:55.528440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl" Apr 22 18:42:55.528586 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.528549 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:42:55.528586 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.528551 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:42:55.528690 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.528603 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:03.52858838 +0000 UTC m=+49.770928756 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found Apr 22 18:42:55.528690 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:42:55.528616 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:43:03.528610519 +0000 UTC m=+49.770950896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found Apr 22 18:42:56.498766 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:56.498293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" event={"ID":"585763e4-02be-466c-b994-4f49efab07bd","Type":"ContainerStarted","Data":"a6d4d8f37f137d02c365c91d19af11f326fed030fc0a475ffad521a754a55ca5"} Apr 22 18:42:56.498766 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:56.498653 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:56.500092 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:56.500063 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" event={"ID":"2f6e9159-5066-4e91-b3cb-ccc1d827eece","Type":"ContainerStarted","Data":"b0b09d8a2160a04b83835fbdf7af57c56dbf85ceaa8dce30c2d86c45117044bd"} Apr 22 18:42:56.500590 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:56.500572 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" Apr 22 18:42:56.502278 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:56.502257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" event={"ID":"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5","Type":"ContainerStarted","Data":"e9501596f0a9cc8e746549f166765b86151b258aa8c188b02b45fa2d0d9490b6"} Apr 22 18:42:56.571704 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:56.571658 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" podStartSLOduration=15.174904096 podStartE2EDuration="21.571623629s" podCreationTimestamp="2026-04-22 18:42:35 +0000 UTC" firstStartedPulling="2026-04-22 18:42:48.215192065 +0000 UTC m=+34.457532445" lastFinishedPulling="2026-04-22 18:42:54.611911587 +0000 UTC m=+40.854251978" observedRunningTime="2026-04-22 18:42:56.537891052 +0000 UTC m=+42.780231450" watchObservedRunningTime="2026-04-22 18:42:56.571623629 +0000 UTC m=+42.813964010" Apr 22 18:42:56.632307 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:56.632253 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" podStartSLOduration=15.255554556 podStartE2EDuration="21.632239568s" podCreationTimestamp="2026-04-22 18:42:35 +0000 UTC" firstStartedPulling="2026-04-22 18:42:48.215306706 +0000 UTC m=+34.457647086" lastFinishedPulling="2026-04-22 18:42:54.591991715 +0000 UTC m=+40.834332098" observedRunningTime="2026-04-22 18:42:56.631759189 +0000 UTC m=+42.874099588" watchObservedRunningTime="2026-04-22 18:42:56.632239568 +0000 UTC m=+42.874579967" Apr 22 18:42:57.505008 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:57.504974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-tdb8c" event={"ID":"83ef0e88-f056-4378-862a-0cd9fcd367d1","Type":"ContainerStarted","Data":"a25736102e8afbec80a67bb69dfb06d38f84439f9b83f0980081158c1f3bac29"} Apr 22 18:42:57.522705 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:57.522657 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tdb8c" podStartSLOduration=33.894196554 podStartE2EDuration="40.522007677s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:49.851448314 +0000 UTC m=+36.093788711" lastFinishedPulling="2026-04-22 18:42:56.479259451 +0000 UTC m=+42.721599834" observedRunningTime="2026-04-22 18:42:57.521502069 +0000 UTC m=+43.763842465" watchObservedRunningTime="2026-04-22 18:42:57.522007677 +0000 UTC m=+43.764348075" Apr 22 18:42:59.510805 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:42:59.510772 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" event={"ID":"2f6e9159-5066-4e91-b3cb-ccc1d827eece","Type":"ContainerStarted","Data":"06dd482333234371e138cc933bfd128813ffb16b86787aeeff27c8e9f5346714"} Apr 22 18:43:00.515472 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:00.515432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" event={"ID":"2f6e9159-5066-4e91-b3cb-ccc1d827eece","Type":"ContainerStarted","Data":"fc59f74fd529c613f7ccdceee2209e84115271b9fd2848fb2c7affefbabfaa50"} Apr 22 18:43:00.538493 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:00.538447 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" podStartSLOduration=14.335259093 podStartE2EDuration="25.538433434s" podCreationTimestamp="2026-04-22 18:42:35 +0000 UTC" firstStartedPulling="2026-04-22 
18:42:48.21518341 +0000 UTC m=+34.457523788" lastFinishedPulling="2026-04-22 18:42:59.418357752 +0000 UTC m=+45.660698129" observedRunningTime="2026-04-22 18:43:00.536971909 +0000 UTC m=+46.779312308" watchObservedRunningTime="2026-04-22 18:43:00.538433434 +0000 UTC m=+46.780773833" Apr 22 18:43:03.490860 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:03.490818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:43:03.490860 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:03.490859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:43:03.491322 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.490956 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:43:03.491322 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.490961 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:43:03.491322 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.491036 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:19.491020111 +0000 UTC m=+65.733360509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found Apr 22 18:43:03.491322 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.490967 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found Apr 22 18:43:03.491322 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.491143 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:19.491119122 +0000 UTC m=+65.733459500 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found Apr 22 18:43:03.592220 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:03.592176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl" Apr 22 18:43:03.592401 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:03.592265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:43:03.592401 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.592314 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:03.592401 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.592378 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:19.592363646 +0000 UTC m=+65.834704023 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found Apr 22 18:43:03.592529 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.592403 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:03.592529 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:03.592456 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:43:19.592444487 +0000 UTC m=+65.834784864 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found Apr 22 18:43:13.454446 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:13.454417 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sr4r9" Apr 22 18:43:19.515827 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:19.515790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:43:19.515827 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:19.515839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:43:19.516302 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.515945 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:43:19.516302 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.515948 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:43:19.516302 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.516029 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found Apr 22 18:43:19.516302 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.516032 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:51.516012873 +0000 UTC m=+97.758353250 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found Apr 22 18:43:19.516302 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.516079 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:51.516067902 +0000 UTC m=+97.758408284 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found Apr 22 18:43:19.617010 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:19.616978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl" Apr 22 18:43:19.617183 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:19.617046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:43:19.617183 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.617123 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:19.617183 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.617176 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:43:51.617161585 +0000 UTC m=+97.859501962 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found Apr 22 18:43:19.617335 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.617124 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:19.617335 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:19.617263 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:51.617244735 +0000 UTC m=+97.859585127 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found Apr 22 18:43:20.020657 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.020601 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:43:20.023783 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.023754 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:43:20.031476 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:20.031449 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:43:20.031580 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:20.031516 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:24.031500628 +0000 UTC m=+130.273841009 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : secret "metrics-daemon-secret" not found Apr 22 18:43:20.121907 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.121875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:43:20.124841 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.124823 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:43:20.135086 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.135072 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:43:20.146162 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.146145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdt8v\" (UniqueName: \"kubernetes.io/projected/4790d6e1-aad7-43f0-95f6-04dde16468f8-kube-api-access-jdt8v\") pod \"network-check-target-mbl6x\" (UID: \"4790d6e1-aad7-43f0-95f6-04dde16468f8\") " pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:43:20.215594 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.215571 2578 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xrqzc\"" Apr 22 18:43:20.223199 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.223182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:43:20.342730 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.339675 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mbl6x"] Apr 22 18:43:20.343578 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:43:20.343522 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4790d6e1_aad7_43f0_95f6_04dde16468f8.slice/crio-baf32bf99dad57aa239db34870e5a8a1fb432350adc8b8e92d7cb93203f050d2 WatchSource:0}: Error finding container baf32bf99dad57aa239db34870e5a8a1fb432350adc8b8e92d7cb93203f050d2: Status 404 returned error can't find the container with id baf32bf99dad57aa239db34870e5a8a1fb432350adc8b8e92d7cb93203f050d2 Apr 22 18:43:20.569145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:20.569054 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mbl6x" event={"ID":"4790d6e1-aad7-43f0-95f6-04dde16468f8","Type":"ContainerStarted","Data":"baf32bf99dad57aa239db34870e5a8a1fb432350adc8b8e92d7cb93203f050d2"} Apr 22 18:43:24.581107 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:24.581074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mbl6x" event={"ID":"4790d6e1-aad7-43f0-95f6-04dde16468f8","Type":"ContainerStarted","Data":"3fec3dbeafa7c31f4f076e2104da89f0c69a594450d8eff6efa7b24224aaa9ff"} Apr 22 18:43:24.581550 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:24.581217 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:43:24.598004 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:24.597961 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mbl6x" podStartSLOduration=67.428008435 podStartE2EDuration="1m10.597946168s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:43:20.344896667 +0000 UTC m=+66.587237044" lastFinishedPulling="2026-04-22 18:43:23.514834399 +0000 UTC m=+69.757174777" observedRunningTime="2026-04-22 18:43:24.597441507 +0000 UTC m=+70.839781908" watchObservedRunningTime="2026-04-22 18:43:24.597946168 +0000 UTC m=+70.840286566" Apr 22 18:43:51.560435 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:51.560334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" Apr 22 18:43:51.560435 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:51.560388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:43:51.560898 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.560487 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:43:51.560898 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.560545 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:55.560530601 +0000 UTC m=+161.802870978 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found Apr 22 18:43:51.560898 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.560489 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:43:51.560898 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.560583 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found Apr 22 18:43:51.560898 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.560631 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:55.560620249 +0000 UTC m=+161.802960641 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found Apr 22 18:43:51.661212 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:51.661177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b" Apr 22 18:43:51.661369 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:51.661237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl" Apr 22 18:43:51.661369 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.661318 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:51.661369 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.661318 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:51.661466 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.661375 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:55.661360224 +0000 UTC m=+161.903700600 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found Apr 22 18:43:51.661466 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:43:51.661389 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:44:55.661383069 +0000 UTC m=+161.903723446 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found Apr 22 18:43:55.586508 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:43:55.586470 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mbl6x" Apr 22 18:44:24.100886 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:24.100831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:44:24.101453 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:24.100997 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:44:24.101453 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:24.101096 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs podName:eaf5856d-69b7-4e87-b07c-f8ea8eed1048 nodeName:}" 
failed. No retries permitted until 2026-04-22 18:46:26.101073041 +0000 UTC m=+252.343413435 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs") pod "network-metrics-daemon-l6jrz" (UID: "eaf5856d-69b7-4e87-b07c-f8ea8eed1048") : secret "metrics-daemon-secret" not found
Apr 22 18:44:48.314508 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:48.314481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b5q9v_b4a78812-3844-4d64-b8b5-016856db881b/dns-node-resolver/0.log"
Apr 22 18:44:48.914293 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:48.914262 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-67qqz_6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8/node-ca/0.log"
Apr 22 18:44:50.613139 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:50.613097 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-c877b6db6-s25mx" podUID="5189531f-23d4-4c5f-86e8-91035f932038"
Apr 22 18:44:50.624867 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:50.624837 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" podUID="505044c7-242a-4290-ab95-7e778348f684"
Apr 22 18:44:50.736490 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:50.736450 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gtx4b" podUID="cbe11d10-5c49-495a-8459-b9d0af0389ae"
Apr 22 18:44:50.742589
ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:50.742563 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-nmmxl" podUID="cd714644-718d-4a14-9b70-5b3aa5980856"
Apr 22 18:44:50.777218 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:50.777188 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:44:50.777218 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:50.777213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:44:50.777366 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:50.777222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c877b6db6-s25mx"
Apr 22 18:44:50.777366 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:50.777343 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:44:52.320949 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:52.320910 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-l6jrz" podUID="eaf5856d-69b7-4e87-b07c-f8ea8eed1048"
Apr 22 18:44:55.645868 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:55.645831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:44:55.646407 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:55.645882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") pod \"image-registry-c877b6db6-s25mx\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " pod="openshift-image-registry/image-registry-c877b6db6-s25mx"
Apr 22 18:44:55.646407 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.645970 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:44:55.646407 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.645982 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c877b6db6-s25mx: secret "image-registry-tls" not found
Apr 22 18:44:55.646407 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.645989 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret
"networking-console-plugin-cert" not found
Apr 22 18:44:55.646407 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.646039 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls podName:5189531f-23d4-4c5f-86e8-91035f932038 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.646024541 +0000 UTC m=+283.888364918 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls") pod "image-registry-c877b6db6-s25mx" (UID: "5189531f-23d4-4c5f-86e8-91035f932038") : secret "image-registry-tls" not found
Apr 22 18:44:55.646407 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.646069 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert podName:505044c7-242a-4290-ab95-7e778348f684 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.646050449 +0000 UTC m=+283.888390854 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qfncj" (UID: "505044c7-242a-4290-ab95-7e778348f684") : secret "networking-console-plugin-cert" not found
Apr 22 18:44:55.746733 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:55.746690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:44:55.746902 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:55.746760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:44:55.746902 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.746845 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:55.746991 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.746906 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls podName:cbe11d10-5c49-495a-8459-b9d0af0389ae nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.746891568 +0000 UTC m=+283.989231945 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls") pod "dns-default-gtx4b" (UID: "cbe11d10-5c49-495a-8459-b9d0af0389ae") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:55.746991 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.746852 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:55.746991 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:55.746968 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert podName:cd714644-718d-4a14-9b70-5b3aa5980856 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.746957111 +0000 UTC m=+283.989297488 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert") pod "ingress-canary-nmmxl" (UID: "cd714644-718d-4a14-9b70-5b3aa5980856") : secret "canary-serving-cert" not found
Apr 22 18:44:56.499703 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:56.499582 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" podUID="585763e4-02be-466c-b994-4f49efab07bd" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused"
Apr 22 18:44:56.791497 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:56.791411 2578 generic.go:358] "Generic (PLEG): container finished" podID="c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5" containerID="e9501596f0a9cc8e746549f166765b86151b258aa8c188b02b45fa2d0d9490b6" exitCode=255
Apr 22 18:44:56.791959 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:56.791496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" event={"ID":"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5","Type":"ContainerDied","Data":"e9501596f0a9cc8e746549f166765b86151b258aa8c188b02b45fa2d0d9490b6"}
Apr 22 18:44:56.791959 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:56.791919 2578 scope.go:117] "RemoveContainer" containerID="e9501596f0a9cc8e746549f166765b86151b258aa8c188b02b45fa2d0d9490b6"
Apr 22 18:44:56.792752 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:56.792737 2578 generic.go:358] "Generic (PLEG): container finished" podID="585763e4-02be-466c-b994-4f49efab07bd" containerID="a6d4d8f37f137d02c365c91d19af11f326fed030fc0a475ffad521a754a55ca5" exitCode=1
Apr 22 18:44:56.792812 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:56.792770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l" event={"ID":"585763e4-02be-466c-b994-4f49efab07bd","Type":"ContainerDied","Data":"a6d4d8f37f137d02c365c91d19af11f326fed030fc0a475ffad521a754a55ca5"}
Apr 22 18:44:56.793028 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:56.793015 2578 scope.go:117] "RemoveContainer" containerID="a6d4d8f37f137d02c365c91d19af11f326fed030fc0a475ffad521a754a55ca5"
Apr 22 18:44:57.796765 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:57.796731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-968b4dfc-2nkj6" event={"ID":"c9b66ee2-a75d-4c3b-8c06-d38c27cec0e5","Type":"ContainerStarted","Data":"3fc775c4a4f63ca8def8c868bdf528aed2b04d22e5d4c1085813a758c827cce7"}
Apr 22 18:44:57.798275 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:57.798251 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l"
event={"ID":"585763e4-02be-466c-b994-4f49efab07bd","Type":"ContainerStarted","Data":"f9b36f9ff1d950f3b3537dfd05348653fdb052adec42179267c379ca8dd74739"}
Apr 22 18:44:57.798516 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:57.798499 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l"
Apr 22 18:44:57.799092 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:57.799075 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6897d948bb-skr9l"
Apr 22 18:44:59.770104 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.770018 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d52x2"]
Apr 22 18:44:59.773188 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.773167 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.778604 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.778581 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:44:59.778756 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.778720 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:44:59.779810 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.779792 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:44:59.779933 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.779812 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:44:59.779933 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.779837 2578 reflector.go:430] "Caches
populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h7vq4\""
Apr 22 18:44:59.789455 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.789432 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d52x2"]
Apr 22 18:44:59.878227 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.878185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2528\" (UniqueName: \"kubernetes.io/projected/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-kube-api-access-j2528\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.878420 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.878249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.878420 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.878333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-crio-socket\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.878420 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.878377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls\") pod
\"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.878562 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.878439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-data-volume\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-data-volume\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979302 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:59.979216 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 18:44:59.979302 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:44:59.979282 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls podName:0e3c2ff2-cc57-4683-bd7f-2538ffeb7788 nodeName:}" failed.
No retries permitted until 2026-04-22 18:45:00.479265255 +0000 UTC m=+166.721605633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls") pod "insights-runtime-extractor-d52x2" (UID: "0e3c2ff2-cc57-4683-bd7f-2538ffeb7788") : secret "insights-runtime-extractor-tls" not found
Apr 22 18:44:59.979382 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2528\" (UniqueName: \"kubernetes.io/projected/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-kube-api-access-j2528\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979382 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979382 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-crio-socket\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979476 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-data-volume\") pod
\"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979476 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-crio-socket\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.979827 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.979810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:44:59.991835 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:44:59.991810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2528\" (UniqueName: \"kubernetes.io/projected/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-kube-api-access-j2528\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:45:00.484068 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:00.484034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:45:00.484241 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:00.484198 2578 secret.go:189] Couldn't get secret
openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 18:45:00.484282 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:00.484265 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls podName:0e3c2ff2-cc57-4683-bd7f-2538ffeb7788 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:01.484246984 +0000 UTC m=+167.726587362 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls") pod "insights-runtime-extractor-d52x2" (UID: "0e3c2ff2-cc57-4683-bd7f-2538ffeb7788") : secret "insights-runtime-extractor-tls" not found
Apr 22 18:45:01.492953 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:01.492896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:45:01.493379 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:01.493036 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 18:45:01.493379 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:01.493106 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls podName:0e3c2ff2-cc57-4683-bd7f-2538ffeb7788 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:03.493090103 +0000 UTC m=+169.735430479 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls") pod "insights-runtime-extractor-d52x2" (UID: "0e3c2ff2-cc57-4683-bd7f-2538ffeb7788") : secret "insights-runtime-extractor-tls" not found
Apr 22 18:45:03.294513 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:03.294480 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz"
Apr 22 18:45:03.509983 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:03.509934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:45:03.510148 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:03.510001 2578 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 18:45:03.510148 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:03.510064 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls podName:0e3c2ff2-cc57-4683-bd7f-2538ffeb7788 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:07.510045915 +0000 UTC m=+173.752386293 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls") pod "insights-runtime-extractor-d52x2" (UID: "0e3c2ff2-cc57-4683-bd7f-2538ffeb7788") : secret "insights-runtime-extractor-tls" not found
Apr 22 18:45:07.540649 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:07.540611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:45:07.543085 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:07.543059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0e3c2ff2-cc57-4683-bd7f-2538ffeb7788-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d52x2\" (UID: \"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788\") " pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:45:07.581416 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:07.581389 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d52x2"
Apr 22 18:45:07.704098 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:07.704063 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d52x2"]
Apr 22 18:45:07.708285 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:45:07.708250 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e3c2ff2_cc57_4683_bd7f_2538ffeb7788.slice/crio-c040525e31a33d314e7b642caa5aaf0c7e9def6bb12f650b7f45c54bac44051f WatchSource:0}: Error finding container c040525e31a33d314e7b642caa5aaf0c7e9def6bb12f650b7f45c54bac44051f: Status 404 returned error can't find the container with id c040525e31a33d314e7b642caa5aaf0c7e9def6bb12f650b7f45c54bac44051f
Apr 22 18:45:07.821450 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:07.821368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d52x2" event={"ID":"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788","Type":"ContainerStarted","Data":"afe182b2d4759a280b26cae7142426fe3e1646cef70b7ad589143f0b48c0582c"}
Apr 22 18:45:07.821450 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:07.821409 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d52x2" event={"ID":"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788","Type":"ContainerStarted","Data":"c040525e31a33d314e7b642caa5aaf0c7e9def6bb12f650b7f45c54bac44051f"}
Apr 22 18:45:08.827293 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:08.827216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d52x2" event={"ID":"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788","Type":"ContainerStarted","Data":"9219a54d5235243f22bb57c600b5c4d5b4f8b179a830507e1d403f9273612073"}
Apr 22 18:45:10.834336 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:10.834302 2578 kubelet.go:2569] "SyncLoop (PLEG): event
for pod" pod="openshift-insights/insights-runtime-extractor-d52x2" event={"ID":"0e3c2ff2-cc57-4683-bd7f-2538ffeb7788","Type":"ContainerStarted","Data":"ce2cf2101779603f3f21619d074ce1a972bcfc586cfb836e56560212d9ece8cd"}
Apr 22 18:45:10.853604 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:10.853558 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d52x2" podStartSLOduration=9.489223467 podStartE2EDuration="11.85354267s" podCreationTimestamp="2026-04-22 18:44:59 +0000 UTC" firstStartedPulling="2026-04-22 18:45:07.762502637 +0000 UTC m=+174.004843013" lastFinishedPulling="2026-04-22 18:45:10.126821837 +0000 UTC m=+176.369162216" observedRunningTime="2026-04-22 18:45:10.851598935 +0000 UTC m=+177.093939335" watchObservedRunningTime="2026-04-22 18:45:10.85354267 +0000 UTC m=+177.095883069"
Apr 22 18:45:17.993994 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:17.993954 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" podUID="2f6e9159-5066-4e91-b3cb-ccc1d827eece" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:45:27.994521 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:27.994483 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" podUID="2f6e9159-5066-4e91-b3cb-ccc1d827eece" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:45:28.872149 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.872109 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dmsbg"]
Apr 22 18:45:28.875175 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.875150 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-dmsbg"
Apr 22 18:45:28.878589 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.878569 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:45:28.878886 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.878867 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:45:28.880072 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.880045 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:45:28.880072 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.880068 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:45:28.880226 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.880078 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:45:28.880226 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.880072 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:45:28.885861 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.885842 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mnhfl\""
Apr 22 18:45:28.903128 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903103 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-wtmp\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") "
pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903230 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-tls\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903230 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903180 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-root\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903335 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903240 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-textfile\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903335 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9tm\" (UniqueName: \"kubernetes.io/projected/ce009679-a885-4ba1-a31d-8658c5ba82eb-kube-api-access-tw9tm\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903405 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903405 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce009679-a885-4ba1-a31d-8658c5ba82eb-metrics-client-ca\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903532 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-sys\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:28.903532 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:28.903455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.003934 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.003906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce009679-a885-4ba1-a31d-8658c5ba82eb-metrics-client-ca\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 
18:45:29.003985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-sys\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-wtmp\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-sys\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-tls\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 
18:45:29.004181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-root\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-textfile\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-root\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9tm\" (UniqueName: \"kubernetes.io/projected/ce009679-a885-4ba1-a31d-8658c5ba82eb-kube-api-access-tw9tm\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-wtmp\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004289 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:29.004172 
2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:45:29.004772 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004772 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:45:29.004346 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-tls podName:ce009679-a885-4ba1-a31d-8658c5ba82eb nodeName:}" failed. No retries permitted until 2026-04-22 18:45:29.504325096 +0000 UTC m=+195.746665477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-tls") pod "node-exporter-dmsbg" (UID: "ce009679-a885-4ba1-a31d-8658c5ba82eb") : secret "node-exporter-tls" not found Apr 22 18:45:29.004772 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004544 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce009679-a885-4ba1-a31d-8658c5ba82eb-metrics-client-ca\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004772 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-textfile\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " 
pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.004895 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.004778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-accelerators-collector-config\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.006504 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.006487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.013948 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.013924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9tm\" (UniqueName: \"kubernetes.io/projected/ce009679-a885-4ba1-a31d-8658c5ba82eb-kube-api-access-tw9tm\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.507206 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.507173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-tls\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.509573 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.509550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/ce009679-a885-4ba1-a31d-8658c5ba82eb-node-exporter-tls\") pod \"node-exporter-dmsbg\" (UID: \"ce009679-a885-4ba1-a31d-8658c5ba82eb\") " pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.784285 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.784193 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dmsbg" Apr 22 18:45:29.793041 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:45:29.793004 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce009679_a885_4ba1_a31d_8658c5ba82eb.slice/crio-9290aecd57f666301ba3627f305fe0d7f39ddf2075b809c1ef20fdc2303ca492 WatchSource:0}: Error finding container 9290aecd57f666301ba3627f305fe0d7f39ddf2075b809c1ef20fdc2303ca492: Status 404 returned error can't find the container with id 9290aecd57f666301ba3627f305fe0d7f39ddf2075b809c1ef20fdc2303ca492 Apr 22 18:45:29.881745 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:29.881708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmsbg" event={"ID":"ce009679-a885-4ba1-a31d-8658c5ba82eb","Type":"ContainerStarted","Data":"9290aecd57f666301ba3627f305fe0d7f39ddf2075b809c1ef20fdc2303ca492"} Apr 22 18:45:30.886656 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:30.886552 2578 generic.go:358] "Generic (PLEG): container finished" podID="ce009679-a885-4ba1-a31d-8658c5ba82eb" containerID="73bc752b300752f0dc2be85dc7ba1500484f6c7dbe1b5aec8a0a6d2927ecea4d" exitCode=0 Apr 22 18:45:30.887093 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:30.886659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmsbg" event={"ID":"ce009679-a885-4ba1-a31d-8658c5ba82eb","Type":"ContainerDied","Data":"73bc752b300752f0dc2be85dc7ba1500484f6c7dbe1b5aec8a0a6d2927ecea4d"} Apr 22 18:45:31.891325 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:31.891288 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmsbg" event={"ID":"ce009679-a885-4ba1-a31d-8658c5ba82eb","Type":"ContainerStarted","Data":"935c310e3a51ba3683029cd5225afd74aa8d5375e0e74b01dc3bbe5b10f55380"} Apr 22 18:45:31.891731 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:31.891333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dmsbg" event={"ID":"ce009679-a885-4ba1-a31d-8658c5ba82eb","Type":"ContainerStarted","Data":"fdc8cc4f4472710885738ea9bb75a876b8dbde9b7d2f6940b1fb8ae79b43ba39"} Apr 22 18:45:31.914282 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:31.914227 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dmsbg" podStartSLOduration=3.217434462 podStartE2EDuration="3.914213117s" podCreationTimestamp="2026-04-22 18:45:28 +0000 UTC" firstStartedPulling="2026-04-22 18:45:29.795257206 +0000 UTC m=+196.037597586" lastFinishedPulling="2026-04-22 18:45:30.492035853 +0000 UTC m=+196.734376241" observedRunningTime="2026-04-22 18:45:31.912204797 +0000 UTC m=+198.154545193" watchObservedRunningTime="2026-04-22 18:45:31.914213117 +0000 UTC m=+198.156553493" Apr 22 18:45:37.993905 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:37.993866 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" podUID="2f6e9159-5066-4e91-b3cb-ccc1d827eece" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:45:37.994255 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:37.993943 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" Apr 22 18:45:37.994420 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:37.994389 2578 kuberuntime_manager.go:1107] "Message for Container of pod" 
containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"fc59f74fd529c613f7ccdceee2209e84115271b9fd2848fb2c7affefbabfaa50"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:45:37.994490 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:37.994475 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" podUID="2f6e9159-5066-4e91-b3cb-ccc1d827eece" containerName="service-proxy" containerID="cri-o://fc59f74fd529c613f7ccdceee2209e84115271b9fd2848fb2c7affefbabfaa50" gracePeriod=30 Apr 22 18:45:38.909300 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:38.909262 2578 generic.go:358] "Generic (PLEG): container finished" podID="2f6e9159-5066-4e91-b3cb-ccc1d827eece" containerID="fc59f74fd529c613f7ccdceee2209e84115271b9fd2848fb2c7affefbabfaa50" exitCode=2 Apr 22 18:45:38.909300 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:38.909304 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" event={"ID":"2f6e9159-5066-4e91-b3cb-ccc1d827eece","Type":"ContainerDied","Data":"fc59f74fd529c613f7ccdceee2209e84115271b9fd2848fb2c7affefbabfaa50"} Apr 22 18:45:38.909501 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:38.909330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b656788d6-csc4d" event={"ID":"2f6e9159-5066-4e91-b3cb-ccc1d827eece","Type":"ContainerStarted","Data":"809c244f76ce8b4aa7072d62480b3f5f1cab98d0f4a8962bca4ae5ab5ecc6d28"} Apr 22 18:45:52.878259 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.878222 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c877b6db6-s25mx"] Apr 22 18:45:52.878688 ip-10-0-129-85 kubenswrapper[2578]: 
E0422 18:45:52.878401 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-c877b6db6-s25mx" podUID="5189531f-23d4-4c5f-86e8-91035f932038" Apr 22 18:45:52.944273 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.944239 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:45:52.948270 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.948248 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:45:52.989563 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989537 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgmx9\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-kube-api-access-kgmx9\") pod \"5189531f-23d4-4c5f-86e8-91035f932038\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " Apr 22 18:45:52.989706 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989572 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-trusted-ca\") pod \"5189531f-23d4-4c5f-86e8-91035f932038\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " Apr 22 18:45:52.989706 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989613 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5189531f-23d4-4c5f-86e8-91035f932038-ca-trust-extracted\") pod \"5189531f-23d4-4c5f-86e8-91035f932038\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " Apr 22 18:45:52.989706 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989663 2578 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-registry-certificates\") pod \"5189531f-23d4-4c5f-86e8-91035f932038\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " Apr 22 18:45:52.989706 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989703 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-bound-sa-token\") pod \"5189531f-23d4-4c5f-86e8-91035f932038\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " Apr 22 18:45:52.989911 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989753 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-installation-pull-secrets\") pod \"5189531f-23d4-4c5f-86e8-91035f932038\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " Apr 22 18:45:52.989911 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989782 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-image-registry-private-configuration\") pod \"5189531f-23d4-4c5f-86e8-91035f932038\" (UID: \"5189531f-23d4-4c5f-86e8-91035f932038\") " Apr 22 18:45:52.990011 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.989973 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5189531f-23d4-4c5f-86e8-91035f932038-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5189531f-23d4-4c5f-86e8-91035f932038" (UID: "5189531f-23d4-4c5f-86e8-91035f932038"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:52.990066 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.990021 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5189531f-23d4-4c5f-86e8-91035f932038" (UID: "5189531f-23d4-4c5f-86e8-91035f932038"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:45:52.990222 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.990187 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5189531f-23d4-4c5f-86e8-91035f932038" (UID: "5189531f-23d4-4c5f-86e8-91035f932038"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:45:52.992216 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.992187 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5189531f-23d4-4c5f-86e8-91035f932038" (UID: "5189531f-23d4-4c5f-86e8-91035f932038"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:45:52.992312 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.992222 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5189531f-23d4-4c5f-86e8-91035f932038" (UID: "5189531f-23d4-4c5f-86e8-91035f932038"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:52.992312 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.992257 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5189531f-23d4-4c5f-86e8-91035f932038" (UID: "5189531f-23d4-4c5f-86e8-91035f932038"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:45:52.992312 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:52.992287 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-kube-api-access-kgmx9" (OuterVolumeSpecName: "kube-api-access-kgmx9") pod "5189531f-23d4-4c5f-86e8-91035f932038" (UID: "5189531f-23d4-4c5f-86e8-91035f932038"). InnerVolumeSpecName "kube-api-access-kgmx9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:53.091304 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.091259 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5189531f-23d4-4c5f-86e8-91035f932038-ca-trust-extracted\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:53.091304 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.091300 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-registry-certificates\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:53.091304 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.091311 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-bound-sa-token\") on node \"ip-10-0-129-85.ec2.internal\" 
DevicePath \"\"" Apr 22 18:45:53.091304 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.091321 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-installation-pull-secrets\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:53.091528 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.091330 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5189531f-23d4-4c5f-86e8-91035f932038-image-registry-private-configuration\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:53.091528 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.091340 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgmx9\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-kube-api-access-kgmx9\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:53.091528 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.091349 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5189531f-23d4-4c5f-86e8-91035f932038-trusted-ca\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:53.947023 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.946993 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c877b6db6-s25mx" Apr 22 18:45:53.984915 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.984886 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c877b6db6-s25mx"] Apr 22 18:45:53.988760 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:53.988736 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c877b6db6-s25mx"] Apr 22 18:45:54.097436 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:54.097410 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5189531f-23d4-4c5f-86e8-91035f932038-registry-tls\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:54.296790 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:54.296705 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5189531f-23d4-4c5f-86e8-91035f932038" path="/var/lib/kubelet/pods/5189531f-23d4-4c5f-86e8-91035f932038/volumes" Apr 22 18:45:58.313365 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:45:58.313332 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b5q9v_b4a78812-3844-4d64-b8b5-016856db881b/dns-node-resolver/0.log" Apr 22 18:46:26.141177 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:26.141121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod \"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz" Apr 22 18:46:26.143512 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:26.143492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eaf5856d-69b7-4e87-b07c-f8ea8eed1048-metrics-certs\") pod 
\"network-metrics-daemon-l6jrz\" (UID: \"eaf5856d-69b7-4e87-b07c-f8ea8eed1048\") " pod="openshift-multus/network-metrics-daemon-l6jrz"
Apr 22 18:46:26.399768 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:26.399686 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7gnxr\""
Apr 22 18:46:26.406648 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:26.406617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l6jrz"
Apr 22 18:46:26.529482 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:26.529458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l6jrz"]
Apr 22 18:46:26.531841 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:46:26.531807 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf5856d_69b7_4e87_b07c_f8ea8eed1048.slice/crio-8f5549ee6f172eb3996d064f4ed2895acf7a61d02a7a842b449a3490fa1f32b5 WatchSource:0}: Error finding container 8f5549ee6f172eb3996d064f4ed2895acf7a61d02a7a842b449a3490fa1f32b5: Status 404 returned error can't find the container with id 8f5549ee6f172eb3996d064f4ed2895acf7a61d02a7a842b449a3490fa1f32b5
Apr 22 18:46:27.029530 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:27.029491 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l6jrz" event={"ID":"eaf5856d-69b7-4e87-b07c-f8ea8eed1048","Type":"ContainerStarted","Data":"8f5549ee6f172eb3996d064f4ed2895acf7a61d02a7a842b449a3490fa1f32b5"}
Apr 22 18:46:28.033693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:28.033655 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l6jrz" event={"ID":"eaf5856d-69b7-4e87-b07c-f8ea8eed1048","Type":"ContainerStarted","Data":"3d2f7ebf30bd73a74926caae4678cf03bbff7b5a5a135acf3f011397e764f3b3"}
Apr 22 18:46:28.033693 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:28.033695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l6jrz" event={"ID":"eaf5856d-69b7-4e87-b07c-f8ea8eed1048","Type":"ContainerStarted","Data":"c5ad0cebda7f08f9f85900edce7679476d8266f97e95059d1a94fcc9c128cb45"}
Apr 22 18:46:28.053131 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:28.053081 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l6jrz" podStartSLOduration=253.12874558 podStartE2EDuration="4m14.05306682s" podCreationTimestamp="2026-04-22 18:42:14 +0000 UTC" firstStartedPulling="2026-04-22 18:46:26.533540397 +0000 UTC m=+252.775880774" lastFinishedPulling="2026-04-22 18:46:27.457861635 +0000 UTC m=+253.700202014" observedRunningTime="2026-04-22 18:46:28.051964631 +0000 UTC m=+254.294305030" watchObservedRunningTime="2026-04-22 18:46:28.05306682 +0000 UTC m=+254.295407272"
Apr 22 18:46:53.778575 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:46:53.778530 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-nmmxl" podUID="cd714644-718d-4a14-9b70-5b3aa5980856"
Apr 22 18:46:53.778575 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:46:53.778546 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" podUID="505044c7-242a-4290-ab95-7e778348f684"
Apr 22 18:46:53.778575 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:46:53.778546 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-gtx4b" podUID="cbe11d10-5c49-495a-8459-b9d0af0389ae"
Apr 22 18:46:54.100397 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:54.100321 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:46:54.100397 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:54.100343 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:46:54.100397 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:54.100321 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:46:57.668025 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.667987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:46:57.670510 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.670478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/505044c7-242a-4290-ab95-7e778348f684-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qfncj\" (UID: \"505044c7-242a-4290-ab95-7e778348f684\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:46:57.707823 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.707802 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-fch7b\""
Apr 22 18:46:57.711871 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.711857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"
Apr 22 18:46:57.769038 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.769008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:46:57.769200 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.769065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:46:57.772151 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.772094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbe11d10-5c49-495a-8459-b9d0af0389ae-metrics-tls\") pod \"dns-default-gtx4b\" (UID: \"cbe11d10-5c49-495a-8459-b9d0af0389ae\") " pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:46:57.772151 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.772112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd714644-718d-4a14-9b70-5b3aa5980856-cert\") pod \"ingress-canary-nmmxl\" (UID: \"cd714644-718d-4a14-9b70-5b3aa5980856\") " pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:46:57.831950 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:57.831918 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qfncj"]
Apr 22 18:46:57.835378 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:46:57.835353 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505044c7_242a_4290_ab95_7e778348f684.slice/crio-27aacead3246ca76a1adf5c85cc60f94678b52455a74fe77786ff84a96fd886e WatchSource:0}: Error finding container 27aacead3246ca76a1adf5c85cc60f94678b52455a74fe77786ff84a96fd886e: Status 404 returned error can't find the container with id 27aacead3246ca76a1adf5c85cc60f94678b52455a74fe77786ff84a96fd886e
Apr 22 18:46:58.004105 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:58.004075 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5kx7g\""
Apr 22 18:46:58.004280 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:58.004209 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrbrj\""
Apr 22 18:46:58.011790 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:58.011764 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:46:58.011908 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:58.011833 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nmmxl"
Apr 22 18:46:58.111546 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:58.111513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" event={"ID":"505044c7-242a-4290-ab95-7e778348f684","Type":"ContainerStarted","Data":"27aacead3246ca76a1adf5c85cc60f94678b52455a74fe77786ff84a96fd886e"}
Apr 22 18:46:58.147278 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:58.147256 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nmmxl"]
Apr 22 18:46:58.149485 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:46:58.149454 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd714644_718d_4a14_9b70_5b3aa5980856.slice/crio-9ff756bb78c8d9c7f961a5ef9e9770fafb658693b9e1db4990d53a8211e54fc2 WatchSource:0}: Error finding container 9ff756bb78c8d9c7f961a5ef9e9770fafb658693b9e1db4990d53a8211e54fc2: Status 404 returned error can't find the container with id 9ff756bb78c8d9c7f961a5ef9e9770fafb658693b9e1db4990d53a8211e54fc2
Apr 22 18:46:58.164246 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:58.164224 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gtx4b"]
Apr 22 18:46:58.166965 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:46:58.166941 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbe11d10_5c49_495a_8459_b9d0af0389ae.slice/crio-d10d3dfc3d764f217d22bb21bff7bd46afc15a0f7e5dc1f838e5248477b2dd15 WatchSource:0}: Error finding container d10d3dfc3d764f217d22bb21bff7bd46afc15a0f7e5dc1f838e5248477b2dd15: Status 404 returned error can't find the container with id d10d3dfc3d764f217d22bb21bff7bd46afc15a0f7e5dc1f838e5248477b2dd15
Apr 22 18:46:59.116515 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:59.116407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" event={"ID":"505044c7-242a-4290-ab95-7e778348f684","Type":"ContainerStarted","Data":"4080a6e7126ca73c8b121430468ad8305cff4e291fe8cf4d8ea44e4befcda1e9"}
Apr 22 18:46:59.117709 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:59.117665 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gtx4b" event={"ID":"cbe11d10-5c49-495a-8459-b9d0af0389ae","Type":"ContainerStarted","Data":"d10d3dfc3d764f217d22bb21bff7bd46afc15a0f7e5dc1f838e5248477b2dd15"}
Apr 22 18:46:59.118773 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:59.118742 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nmmxl" event={"ID":"cd714644-718d-4a14-9b70-5b3aa5980856","Type":"ContainerStarted","Data":"9ff756bb78c8d9c7f961a5ef9e9770fafb658693b9e1db4990d53a8211e54fc2"}
Apr 22 18:46:59.136999 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:46:59.136950 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qfncj" podStartSLOduration=262.217094082 podStartE2EDuration="4m23.136938069s" podCreationTimestamp="2026-04-22 18:42:36 +0000 UTC" firstStartedPulling="2026-04-22 18:46:57.837120381 +0000 UTC m=+284.079460758" lastFinishedPulling="2026-04-22 18:46:58.756964369 +0000 UTC m=+284.999304745" observedRunningTime="2026-04-22 18:46:59.13535604 +0000 UTC m=+285.377696446" watchObservedRunningTime="2026-04-22 18:46:59.136938069 +0000 UTC m=+285.379278468"
Apr 22 18:47:01.128187 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:01.128153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gtx4b" event={"ID":"cbe11d10-5c49-495a-8459-b9d0af0389ae","Type":"ContainerStarted","Data":"804b717ecc40af93a935b35ae8a68f25a3673f6adaa76524a898d0d264ff7324"}
Apr 22 18:47:01.128187 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:01.128192 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gtx4b" event={"ID":"cbe11d10-5c49-495a-8459-b9d0af0389ae","Type":"ContainerStarted","Data":"50beec83a62354bdd43e000686c05c3d1e1666fafbd68a786c1f1faa9cb8a095"}
Apr 22 18:47:01.128685 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:01.128252 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:47:01.129561 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:01.129543 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nmmxl" event={"ID":"cd714644-718d-4a14-9b70-5b3aa5980856","Type":"ContainerStarted","Data":"0d65d448232fec5f77d555a8875d57ac8ce29352091b11267ab930b7e50788d2"}
Apr 22 18:47:01.148432 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:01.148379 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gtx4b" podStartSLOduration=251.994477588 podStartE2EDuration="4m14.14836774s" podCreationTimestamp="2026-04-22 18:42:47 +0000 UTC" firstStartedPulling="2026-04-22 18:46:58.168664222 +0000 UTC m=+284.411004599" lastFinishedPulling="2026-04-22 18:47:00.322554363 +0000 UTC m=+286.564894751" observedRunningTime="2026-04-22 18:47:01.147477021 +0000 UTC m=+287.389817419" watchObservedRunningTime="2026-04-22 18:47:01.14836774 +0000 UTC m=+287.390708138"
Apr 22 18:47:01.165103 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:01.165054 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nmmxl" podStartSLOduration=251.989807587 podStartE2EDuration="4m14.165044462s" podCreationTimestamp="2026-04-22 18:42:47 +0000 UTC" firstStartedPulling="2026-04-22 18:46:58.151254607 +0000 UTC m=+284.393594985" lastFinishedPulling="2026-04-22 18:47:00.32649148 +0000 UTC m=+286.568831860" observedRunningTime="2026-04-22 18:47:01.164372644 +0000 UTC m=+287.406713045" watchObservedRunningTime="2026-04-22 18:47:01.165044462 +0000 UTC m=+287.407384860"
Apr 22 18:47:11.135879 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:11.135850 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gtx4b"
Apr 22 18:47:14.208898 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:14.208864 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log"
Apr 22 18:47:14.209324 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:14.208939 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log"
Apr 22 18:47:14.215068 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:47:14.215050 2578 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 18:51:31.273749 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.273711 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"]
Apr 22 18:51:31.276729 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.276705 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.279371 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.279348 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-2rvx2\""
Apr 22 18:51:31.280671 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.280617 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 18:51:31.280671 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.280617 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:51:31.291702 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.291681 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"]
Apr 22 18:51:31.317749 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.317723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20c3eb36-cee9-45e7-802a-8362efab2dbd-tmp\") pod \"openshift-lws-operator-bfc7f696d-wv888\" (UID: \"20c3eb36-cee9-45e7-802a-8362efab2dbd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.317853 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.317756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn64w\" (UniqueName: \"kubernetes.io/projected/20c3eb36-cee9-45e7-802a-8362efab2dbd-kube-api-access-xn64w\") pod \"openshift-lws-operator-bfc7f696d-wv888\" (UID: \"20c3eb36-cee9-45e7-802a-8362efab2dbd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.418927 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.418895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20c3eb36-cee9-45e7-802a-8362efab2dbd-tmp\") pod \"openshift-lws-operator-bfc7f696d-wv888\" (UID: \"20c3eb36-cee9-45e7-802a-8362efab2dbd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.419095 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.418938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn64w\" (UniqueName: \"kubernetes.io/projected/20c3eb36-cee9-45e7-802a-8362efab2dbd-kube-api-access-xn64w\") pod \"openshift-lws-operator-bfc7f696d-wv888\" (UID: \"20c3eb36-cee9-45e7-802a-8362efab2dbd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.419346 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.419326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20c3eb36-cee9-45e7-802a-8362efab2dbd-tmp\") pod \"openshift-lws-operator-bfc7f696d-wv888\" (UID: \"20c3eb36-cee9-45e7-802a-8362efab2dbd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.431123 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.431105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn64w\" (UniqueName: \"kubernetes.io/projected/20c3eb36-cee9-45e7-802a-8362efab2dbd-kube-api-access-xn64w\") pod \"openshift-lws-operator-bfc7f696d-wv888\" (UID: \"20c3eb36-cee9-45e7-802a-8362efab2dbd\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.585678 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.585545 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"
Apr 22 18:51:31.706099 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.706072 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888"]
Apr 22 18:51:31.710865 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:51:31.710812 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20c3eb36_cee9_45e7_802a_8362efab2dbd.slice/crio-e13acd8bc38b6fb10bd8f68bf340226acc4fb0fd006dc356f79176f8e80c4040 WatchSource:0}: Error finding container e13acd8bc38b6fb10bd8f68bf340226acc4fb0fd006dc356f79176f8e80c4040: Status 404 returned error can't find the container with id e13acd8bc38b6fb10bd8f68bf340226acc4fb0fd006dc356f79176f8e80c4040
Apr 22 18:51:31.711737 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.711718 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:51:31.811130 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:31.811092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888" event={"ID":"20c3eb36-cee9-45e7-802a-8362efab2dbd","Type":"ContainerStarted","Data":"e13acd8bc38b6fb10bd8f68bf340226acc4fb0fd006dc356f79176f8e80c4040"}
Apr 22 18:51:34.820097 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:34.820053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888" event={"ID":"20c3eb36-cee9-45e7-802a-8362efab2dbd","Type":"ContainerStarted","Data":"f56519c67f1c7bbd46ad5297a5a588668fcf391d39bc441a5df92da268f521be"}
Apr 22 18:51:34.841544 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:34.841495 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wv888" podStartSLOduration=1.217089086 podStartE2EDuration="3.841481458s" podCreationTimestamp="2026-04-22 18:51:31 +0000 UTC" firstStartedPulling="2026-04-22 18:51:31.711876199 +0000 UTC m=+557.954216575" lastFinishedPulling="2026-04-22 18:51:34.33626857 +0000 UTC m=+560.578608947" observedRunningTime="2026-04-22 18:51:34.839077329 +0000 UTC m=+561.081417728" watchObservedRunningTime="2026-04-22 18:51:34.841481458 +0000 UTC m=+561.083821857"
Apr 22 18:51:50.649650 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.649592 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"]
Apr 22 18:51:50.652875 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.652859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.656104 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.656083 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 22 18:51:50.656217 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.656130 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 22 18:51:50.656217 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.656202 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 22 18:51:50.656332 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.656278 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-srzjc\""
Apr 22 18:51:50.656469 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.656450 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 22 18:51:50.674428 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.674408 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"]
Apr 22 18:51:50.749379 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.749346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb6rx\" (UniqueName: \"kubernetes.io/projected/3c9f2586-5824-4e5d-ad14-46003a8242ac-kube-api-access-gb6rx\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.749526 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.749399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c9f2586-5824-4e5d-ad14-46003a8242ac-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.749526 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.749418 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c9f2586-5824-4e5d-ad14-46003a8242ac-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.850317 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.850287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gb6rx\" (UniqueName: \"kubernetes.io/projected/3c9f2586-5824-4e5d-ad14-46003a8242ac-kube-api-access-gb6rx\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.850418 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.850342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c9f2586-5824-4e5d-ad14-46003a8242ac-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.850418 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.850361 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c9f2586-5824-4e5d-ad14-46003a8242ac-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.852948 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.852925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c9f2586-5824-4e5d-ad14-46003a8242ac-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.853046 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.852952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c9f2586-5824-4e5d-ad14-46003a8242ac-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.863518 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.863487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb6rx\" (UniqueName: \"kubernetes.io/projected/3c9f2586-5824-4e5d-ad14-46003a8242ac-kube-api-access-gb6rx\") pod \"opendatahub-operator-controller-manager-dd89cc56c-5spxh\" (UID: \"3c9f2586-5824-4e5d-ad14-46003a8242ac\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:50.962004 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:50.961971 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:51.082633 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:51.082590 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"]
Apr 22 18:51:51.087020 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:51:51.086990 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c9f2586_5824_4e5d_ad14_46003a8242ac.slice/crio-c703eb30d0ac1edc86754cfc939f1b28b82fa3a920d45a11968c47bd0ac6099f WatchSource:0}: Error finding container c703eb30d0ac1edc86754cfc939f1b28b82fa3a920d45a11968c47bd0ac6099f: Status 404 returned error can't find the container with id c703eb30d0ac1edc86754cfc939f1b28b82fa3a920d45a11968c47bd0ac6099f
Apr 22 18:51:51.866926 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:51.866885 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh" event={"ID":"3c9f2586-5824-4e5d-ad14-46003a8242ac","Type":"ContainerStarted","Data":"c703eb30d0ac1edc86754cfc939f1b28b82fa3a920d45a11968c47bd0ac6099f"}
Apr 22 18:51:53.876893 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:53.876790 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh" event={"ID":"3c9f2586-5824-4e5d-ad14-46003a8242ac","Type":"ContainerStarted","Data":"96c4bf899fcc71bd71ad7b067d2e5292f9192df6d72db9b409c18c14bdf3c230"}
Apr 22 18:51:53.877274 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:53.876908 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:51:53.930826 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:51:53.930676 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh" podStartSLOduration=1.517825836 podStartE2EDuration="3.930656101s" podCreationTimestamp="2026-04-22 18:51:50 +0000 UTC" firstStartedPulling="2026-04-22 18:51:51.088774864 +0000 UTC m=+577.331115240" lastFinishedPulling="2026-04-22 18:51:53.501605126 +0000 UTC m=+579.743945505" observedRunningTime="2026-04-22 18:51:53.930090886 +0000 UTC m=+580.172431284" watchObservedRunningTime="2026-04-22 18:51:53.930656101 +0000 UTC m=+580.172996496"
Apr 22 18:52:04.885265 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:04.885228 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-5spxh"
Apr 22 18:52:09.071039 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.071004 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7cff94f675-n926r"]
Apr 22 18:52:09.077944 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.077914 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.083983 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.083938 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 22 18:52:09.084265 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.084211 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-dthq8\""
Apr 22 18:52:09.084265 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.084227 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:52:09.086074 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.085888 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:52:09.086074 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.085960 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 22 18:52:09.086838 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.086814 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7cff94f675-n926r"]
Apr 22 18:52:09.179416 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.179378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/411946d3-1e28-4c55-85a3-640dbbfbed40-tls-certs\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.179704 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.179684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/411946d3-1e28-4c55-85a3-640dbbfbed40-tmp\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.179855 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.179839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv58h\" (UniqueName: \"kubernetes.io/projected/411946d3-1e28-4c55-85a3-640dbbfbed40-kube-api-access-dv58h\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.280807 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.280757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv58h\" (UniqueName: \"kubernetes.io/projected/411946d3-1e28-4c55-85a3-640dbbfbed40-kube-api-access-dv58h\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.280978 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.280886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/411946d3-1e28-4c55-85a3-640dbbfbed40-tls-certs\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.280978 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.280917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/411946d3-1e28-4c55-85a3-640dbbfbed40-tmp\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.283433 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.283406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/411946d3-1e28-4c55-85a3-640dbbfbed40-tmp\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.283733 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.283710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/411946d3-1e28-4c55-85a3-640dbbfbed40-tls-certs\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.291968 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.291941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv58h\" (UniqueName: \"kubernetes.io/projected/411946d3-1e28-4c55-85a3-640dbbfbed40-kube-api-access-dv58h\") pod \"kube-auth-proxy-7cff94f675-n926r\" (UID: \"411946d3-1e28-4c55-85a3-640dbbfbed40\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.389239 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.389166 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r"
Apr 22 18:52:09.536145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.536122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7cff94f675-n926r"]
Apr 22 18:52:09.538631 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:52:09.538601 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod411946d3_1e28_4c55_85a3_640dbbfbed40.slice/crio-fd5a3663641c795b55f26dfc6414da2970a3ff07a2b63d1d1a06e1a339c9fcd3 WatchSource:0}: Error finding container fd5a3663641c795b55f26dfc6414da2970a3ff07a2b63d1d1a06e1a339c9fcd3: Status 404 returned error can't find the container with id fd5a3663641c795b55f26dfc6414da2970a3ff07a2b63d1d1a06e1a339c9fcd3
Apr 22 18:52:09.923843 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:09.923808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r" event={"ID":"411946d3-1e28-4c55-85a3-640dbbfbed40","Type":"ContainerStarted","Data":"fd5a3663641c795b55f26dfc6414da2970a3ff07a2b63d1d1a06e1a339c9fcd3"}
Apr 22 18:52:12.219960 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.219922 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-n2r78"]
Apr 22 18:52:12.223248 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.223226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78"
Apr 22 18:52:12.226313 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.226282 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-k9nl8\""
Apr 22 18:52:12.226735 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.226619 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 22 18:52:12.232069 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.232046 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-n2r78"]
Apr 22 18:52:12.310268 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.310238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bdd73ee-82ea-4bf9-9629-952f519692d3-cert\") pod \"odh-model-controller-858dbf95b8-n2r78\" (UID: \"1bdd73ee-82ea-4bf9-9629-952f519692d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-n2r78"
Apr 22 18:52:12.310445 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.310286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q2fq\" (UniqueName: \"kubernetes.io/projected/1bdd73ee-82ea-4bf9-9629-952f519692d3-kube-api-access-6q2fq\") pod \"odh-model-controller-858dbf95b8-n2r78\" (UID: \"1bdd73ee-82ea-4bf9-9629-952f519692d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-n2r78"
Apr 22 18:52:12.411486 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.411447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bdd73ee-82ea-4bf9-9629-952f519692d3-cert\") pod \"odh-model-controller-858dbf95b8-n2r78\" (UID: \"1bdd73ee-82ea-4bf9-9629-952f519692d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-n2r78"
Apr 22
18:52:12.411703 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.411503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q2fq\" (UniqueName: \"kubernetes.io/projected/1bdd73ee-82ea-4bf9-9629-952f519692d3-kube-api-access-6q2fq\") pod \"odh-model-controller-858dbf95b8-n2r78\" (UID: \"1bdd73ee-82ea-4bf9-9629-952f519692d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:12.411703 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:52:12.411614 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:52:12.411703 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:52:12.411694 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdd73ee-82ea-4bf9-9629-952f519692d3-cert podName:1bdd73ee-82ea-4bf9-9629-952f519692d3 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:12.911676369 +0000 UTC m=+599.154016746 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bdd73ee-82ea-4bf9-9629-952f519692d3-cert") pod "odh-model-controller-858dbf95b8-n2r78" (UID: "1bdd73ee-82ea-4bf9-9629-952f519692d3") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:52:12.435966 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.435939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q2fq\" (UniqueName: \"kubernetes.io/projected/1bdd73ee-82ea-4bf9-9629-952f519692d3-kube-api-access-6q2fq\") pod \"odh-model-controller-858dbf95b8-n2r78\" (UID: \"1bdd73ee-82ea-4bf9-9629-952f519692d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:12.916201 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.916107 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bdd73ee-82ea-4bf9-9629-952f519692d3-cert\") pod \"odh-model-controller-858dbf95b8-n2r78\" (UID: \"1bdd73ee-82ea-4bf9-9629-952f519692d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:12.918601 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.918579 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bdd73ee-82ea-4bf9-9629-952f519692d3-cert\") pod \"odh-model-controller-858dbf95b8-n2r78\" (UID: \"1bdd73ee-82ea-4bf9-9629-952f519692d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:12.934733 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.934694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r" event={"ID":"411946d3-1e28-4c55-85a3-640dbbfbed40","Type":"ContainerStarted","Data":"674666906f114f579bc7b48afaaea3c4441651201819bf8512c0be8f78252b23"} Apr 22 18:52:12.965989 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:12.965933 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress/kube-auth-proxy-7cff94f675-n926r" podStartSLOduration=1.01312182 podStartE2EDuration="3.965917075s" podCreationTimestamp="2026-04-22 18:52:09 +0000 UTC" firstStartedPulling="2026-04-22 18:52:09.540221398 +0000 UTC m=+595.782561775" lastFinishedPulling="2026-04-22 18:52:12.493016649 +0000 UTC m=+598.735357030" observedRunningTime="2026-04-22 18:52:12.964552974 +0000 UTC m=+599.206893367" watchObservedRunningTime="2026-04-22 18:52:12.965917075 +0000 UTC m=+599.208257474" Apr 22 18:52:13.136509 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:13.136478 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:13.262230 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:13.262197 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-n2r78"] Apr 22 18:52:13.264849 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:52:13.264822 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bdd73ee_82ea_4bf9_9629_952f519692d3.slice/crio-9a67d15c36ddcbc350ee3e95af354f44e7c89845ab11e117da1175fd6624326d WatchSource:0}: Error finding container 9a67d15c36ddcbc350ee3e95af354f44e7c89845ab11e117da1175fd6624326d: Status 404 returned error can't find the container with id 9a67d15c36ddcbc350ee3e95af354f44e7c89845ab11e117da1175fd6624326d Apr 22 18:52:13.941243 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:13.941199 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" event={"ID":"1bdd73ee-82ea-4bf9-9629-952f519692d3","Type":"ContainerStarted","Data":"9a67d15c36ddcbc350ee3e95af354f44e7c89845ab11e117da1175fd6624326d"} Apr 22 18:52:14.238389 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:14.238311 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:52:14.240214 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:14.240185 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:52:16.949811 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:16.949777 2578 generic.go:358] "Generic (PLEG): container finished" podID="1bdd73ee-82ea-4bf9-9629-952f519692d3" containerID="d13e5a9c16bb7f11bb4fc3fc8da2182089f3274ca72c1cb6d28f44a9ac99e7d8" exitCode=1 Apr 22 18:52:16.950309 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:16.949828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" event={"ID":"1bdd73ee-82ea-4bf9-9629-952f519692d3","Type":"ContainerDied","Data":"d13e5a9c16bb7f11bb4fc3fc8da2182089f3274ca72c1cb6d28f44a9ac99e7d8"} Apr 22 18:52:16.950309 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:16.950131 2578 scope.go:117] "RemoveContainer" containerID="d13e5a9c16bb7f11bb4fc3fc8da2182089f3274ca72c1cb6d28f44a9ac99e7d8" Apr 22 18:52:17.954189 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:17.954152 2578 generic.go:358] "Generic (PLEG): container finished" podID="1bdd73ee-82ea-4bf9-9629-952f519692d3" containerID="308c225f467317d5840b6b396d5856744a9e25503943a59b1794a9c218812cd1" exitCode=1 Apr 22 18:52:17.954624 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:17.954238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" event={"ID":"1bdd73ee-82ea-4bf9-9629-952f519692d3","Type":"ContainerDied","Data":"308c225f467317d5840b6b396d5856744a9e25503943a59b1794a9c218812cd1"} Apr 22 18:52:17.954624 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:17.954281 2578 scope.go:117] "RemoveContainer" containerID="d13e5a9c16bb7f11bb4fc3fc8da2182089f3274ca72c1cb6d28f44a9ac99e7d8" 
Apr 22 18:52:17.954624 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:17.954490 2578 scope.go:117] "RemoveContainer" containerID="308c225f467317d5840b6b396d5856744a9e25503943a59b1794a9c218812cd1" Apr 22 18:52:17.954788 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:52:17.954712 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-n2r78_opendatahub(1bdd73ee-82ea-4bf9-9629-952f519692d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" podUID="1bdd73ee-82ea-4bf9-9629-952f519692d3" Apr 22 18:52:18.298015 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.297919 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xjx2g"] Apr 22 18:52:18.302212 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.302194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:18.307838 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.307818 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 22 18:52:18.307838 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.307825 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-t6w7g\"" Apr 22 18:52:18.374472 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.374443 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xjx2g"] Apr 22 18:52:18.461784 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.461754 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3890bca-d60a-440d-8bb4-506cbe672756-cert\") pod 
\"kserve-controller-manager-856948b99f-xjx2g\" (UID: \"e3890bca-d60a-440d-8bb4-506cbe672756\") " pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:18.461927 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.461813 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxpr\" (UniqueName: \"kubernetes.io/projected/e3890bca-d60a-440d-8bb4-506cbe672756-kube-api-access-ndxpr\") pod \"kserve-controller-manager-856948b99f-xjx2g\" (UID: \"e3890bca-d60a-440d-8bb4-506cbe672756\") " pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:18.562921 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.562833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3890bca-d60a-440d-8bb4-506cbe672756-cert\") pod \"kserve-controller-manager-856948b99f-xjx2g\" (UID: \"e3890bca-d60a-440d-8bb4-506cbe672756\") " pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:18.562921 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.562893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxpr\" (UniqueName: \"kubernetes.io/projected/e3890bca-d60a-440d-8bb4-506cbe672756-kube-api-access-ndxpr\") pod \"kserve-controller-manager-856948b99f-xjx2g\" (UID: \"e3890bca-d60a-440d-8bb4-506cbe672756\") " pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:18.563129 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:52:18.562981 2578 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 18:52:18.563129 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:52:18.563059 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3890bca-d60a-440d-8bb4-506cbe672756-cert podName:e3890bca-d60a-440d-8bb4-506cbe672756 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:52:19.063040572 +0000 UTC m=+605.305380959 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3890bca-d60a-440d-8bb4-506cbe672756-cert") pod "kserve-controller-manager-856948b99f-xjx2g" (UID: "e3890bca-d60a-440d-8bb4-506cbe672756") : secret "kserve-webhook-server-cert" not found Apr 22 18:52:18.583803 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.583775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxpr\" (UniqueName: \"kubernetes.io/projected/e3890bca-d60a-440d-8bb4-506cbe672756-kube-api-access-ndxpr\") pod \"kserve-controller-manager-856948b99f-xjx2g\" (UID: \"e3890bca-d60a-440d-8bb4-506cbe672756\") " pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:18.958472 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:18.958449 2578 scope.go:117] "RemoveContainer" containerID="308c225f467317d5840b6b396d5856744a9e25503943a59b1794a9c218812cd1" Apr 22 18:52:18.958859 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:52:18.958609 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-n2r78_opendatahub(1bdd73ee-82ea-4bf9-9629-952f519692d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" podUID="1bdd73ee-82ea-4bf9-9629-952f519692d3" Apr 22 18:52:19.065524 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:19.065492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3890bca-d60a-440d-8bb4-506cbe672756-cert\") pod \"kserve-controller-manager-856948b99f-xjx2g\" (UID: \"e3890bca-d60a-440d-8bb4-506cbe672756\") " pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:19.068086 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:19.068058 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3890bca-d60a-440d-8bb4-506cbe672756-cert\") pod \"kserve-controller-manager-856948b99f-xjx2g\" (UID: \"e3890bca-d60a-440d-8bb4-506cbe672756\") " pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:19.211595 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:19.211508 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:19.343864 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:19.343839 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xjx2g"] Apr 22 18:52:19.346589 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:52:19.346558 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3890bca_d60a_440d_8bb4_506cbe672756.slice/crio-c49f6c3fe94b4983df22038e3235abb8fc687d4eed8d1790f7db33d8dccd427b WatchSource:0}: Error finding container c49f6c3fe94b4983df22038e3235abb8fc687d4eed8d1790f7db33d8dccd427b: Status 404 returned error can't find the container with id c49f6c3fe94b4983df22038e3235abb8fc687d4eed8d1790f7db33d8dccd427b Apr 22 18:52:19.962298 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:19.962261 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" event={"ID":"e3890bca-d60a-440d-8bb4-506cbe672756","Type":"ContainerStarted","Data":"c49f6c3fe94b4983df22038e3235abb8fc687d4eed8d1790f7db33d8dccd427b"} Apr 22 18:52:22.975302 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:22.975268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" event={"ID":"e3890bca-d60a-440d-8bb4-506cbe672756","Type":"ContainerStarted","Data":"cead4e7bf7be7991955efa3fdcfb935ae7b7a879d44f25a85f91049c9878681f"} Apr 22 
18:52:22.975700 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:22.975397 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:52:23.017178 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:23.017129 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" podStartSLOduration=2.215596747 podStartE2EDuration="5.017114978s" podCreationTimestamp="2026-04-22 18:52:18 +0000 UTC" firstStartedPulling="2026-04-22 18:52:19.348326225 +0000 UTC m=+605.590666602" lastFinishedPulling="2026-04-22 18:52:22.149844455 +0000 UTC m=+608.392184833" observedRunningTime="2026-04-22 18:52:23.016859177 +0000 UTC m=+609.259199576" watchObservedRunningTime="2026-04-22 18:52:23.017114978 +0000 UTC m=+609.259455376" Apr 22 18:52:23.137020 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:23.136988 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:23.137340 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:23.137328 2578 scope.go:117] "RemoveContainer" containerID="308c225f467317d5840b6b396d5856744a9e25503943a59b1794a9c218812cd1" Apr 22 18:52:23.137540 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:52:23.137525 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-n2r78_opendatahub(1bdd73ee-82ea-4bf9-9629-952f519692d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" podUID="1bdd73ee-82ea-4bf9-9629-952f519692d3" Apr 22 18:52:24.615665 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.614052 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn"] Apr 22 18:52:24.624443 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 18:52:24.624418 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:24.627524 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.627497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 18:52:24.628165 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.628142 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-h8dnm\"" Apr 22 18:52:24.628886 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.628857 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 18:52:24.629022 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.628894 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn"] Apr 22 18:52:24.706444 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.706406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/72035573-40e8-4d8a-aa74-0fa510bb2123-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5f6fn\" (UID: \"72035573-40e8-4d8a-aa74-0fa510bb2123\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:24.706444 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.706442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl7hm\" (UniqueName: \"kubernetes.io/projected/72035573-40e8-4d8a-aa74-0fa510bb2123-kube-api-access-tl7hm\") pod \"servicemesh-operator3-55f49c5f94-5f6fn\" (UID: \"72035573-40e8-4d8a-aa74-0fa510bb2123\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:24.807577 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:52:24.807538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/72035573-40e8-4d8a-aa74-0fa510bb2123-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5f6fn\" (UID: \"72035573-40e8-4d8a-aa74-0fa510bb2123\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:24.807750 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.807581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl7hm\" (UniqueName: \"kubernetes.io/projected/72035573-40e8-4d8a-aa74-0fa510bb2123-kube-api-access-tl7hm\") pod \"servicemesh-operator3-55f49c5f94-5f6fn\" (UID: \"72035573-40e8-4d8a-aa74-0fa510bb2123\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:24.810237 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.810199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/72035573-40e8-4d8a-aa74-0fa510bb2123-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5f6fn\" (UID: \"72035573-40e8-4d8a-aa74-0fa510bb2123\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:24.829668 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.829629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl7hm\" (UniqueName: \"kubernetes.io/projected/72035573-40e8-4d8a-aa74-0fa510bb2123-kube-api-access-tl7hm\") pod \"servicemesh-operator3-55f49c5f94-5f6fn\" (UID: \"72035573-40e8-4d8a-aa74-0fa510bb2123\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:24.934471 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:24.934439 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:25.073871 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:25.073837 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn"] Apr 22 18:52:25.079132 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:52:25.079104 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72035573_40e8_4d8a_aa74_0fa510bb2123.slice/crio-979b1dc8a6c6295f5f4c8fe7501459ea6f7b204928b238d57278494cd5a1487d WatchSource:0}: Error finding container 979b1dc8a6c6295f5f4c8fe7501459ea6f7b204928b238d57278494cd5a1487d: Status 404 returned error can't find the container with id 979b1dc8a6c6295f5f4c8fe7501459ea6f7b204928b238d57278494cd5a1487d Apr 22 18:52:25.988109 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:25.988048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" event={"ID":"72035573-40e8-4d8a-aa74-0fa510bb2123","Type":"ContainerStarted","Data":"979b1dc8a6c6295f5f4c8fe7501459ea6f7b204928b238d57278494cd5a1487d"} Apr 22 18:52:27.996377 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:27.996342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" event={"ID":"72035573-40e8-4d8a-aa74-0fa510bb2123","Type":"ContainerStarted","Data":"cf22e556fd87f055f75502619c181b220989226dbe1ba1f36426a875e906ade7"} Apr 22 18:52:27.996751 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:27.996399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:28.037972 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:28.037920 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" 
podStartSLOduration=1.615217319 podStartE2EDuration="4.037906766s" podCreationTimestamp="2026-04-22 18:52:24 +0000 UTC" firstStartedPulling="2026-04-22 18:52:25.08150225 +0000 UTC m=+611.323842626" lastFinishedPulling="2026-04-22 18:52:27.504191693 +0000 UTC m=+613.746532073" observedRunningTime="2026-04-22 18:52:28.037658796 +0000 UTC m=+614.279999197" watchObservedRunningTime="2026-04-22 18:52:28.037906766 +0000 UTC m=+614.280247164" Apr 22 18:52:33.136839 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:33.136791 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:33.137200 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:33.137172 2578 scope.go:117] "RemoveContainer" containerID="308c225f467317d5840b6b396d5856744a9e25503943a59b1794a9c218812cd1" Apr 22 18:52:34.015984 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:34.015948 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" event={"ID":"1bdd73ee-82ea-4bf9-9629-952f519692d3","Type":"ContainerStarted","Data":"e7b3d61cbd13bbae0e05122907888988564421754bbc8e06b37a9779fdb75b76"} Apr 22 18:52:34.016176 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:34.016156 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:34.038068 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:34.038024 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" podStartSLOduration=1.910872925 podStartE2EDuration="22.038010946s" podCreationTimestamp="2026-04-22 18:52:12 +0000 UTC" firstStartedPulling="2026-04-22 18:52:13.266074523 +0000 UTC m=+599.508414901" lastFinishedPulling="2026-04-22 18:52:33.393212544 +0000 UTC m=+619.635552922" observedRunningTime="2026-04-22 18:52:34.037019301 +0000 UTC m=+620.279359700" 
watchObservedRunningTime="2026-04-22 18:52:34.038010946 +0000 UTC m=+620.280351367" Apr 22 18:52:39.001837 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:39.001804 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5f6fn" Apr 22 18:52:45.022070 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:45.022040 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-n2r78" Apr 22 18:52:53.984386 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:52:53.984350 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-xjx2g" Apr 22 18:53:08.503084 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.503046 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9798n"] Apr 22 18:53:08.506235 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.506214 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9798n"
Apr 22 18:53:08.508924 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.508902 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-qr42t\""
Apr 22 18:53:08.508924 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.508917 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 18:53:08.509134 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.509008 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 18:53:08.515880 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.515859 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9798n"]
Apr 22 18:53:08.612770 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.612738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhntr\" (UniqueName: \"kubernetes.io/projected/f88426d2-d14e-4a09-8a41-e021bd41c917-kube-api-access-fhntr\") pod \"kuadrant-operator-catalog-9798n\" (UID: \"f88426d2-d14e-4a09-8a41-e021bd41c917\") " pod="kuadrant-system/kuadrant-operator-catalog-9798n"
Apr 22 18:53:08.713909 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.713875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhntr\" (UniqueName: \"kubernetes.io/projected/f88426d2-d14e-4a09-8a41-e021bd41c917-kube-api-access-fhntr\") pod \"kuadrant-operator-catalog-9798n\" (UID: \"f88426d2-d14e-4a09-8a41-e021bd41c917\") " pod="kuadrant-system/kuadrant-operator-catalog-9798n"
Apr 22 18:53:08.722698 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.722663 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhntr\" (UniqueName: \"kubernetes.io/projected/f88426d2-d14e-4a09-8a41-e021bd41c917-kube-api-access-fhntr\") pod \"kuadrant-operator-catalog-9798n\" (UID: \"f88426d2-d14e-4a09-8a41-e021bd41c917\") " pod="kuadrant-system/kuadrant-operator-catalog-9798n"
Apr 22 18:53:08.815044 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.814964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9798n"
Apr 22 18:53:08.869938 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.869901 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9798n"]
Apr 22 18:53:08.949604 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:08.949574 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9798n"]
Apr 22 18:53:08.953199 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:53:08.953174 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf88426d2_d14e_4a09_8a41_e021bd41c917.slice/crio-114d61085ced481fa60c3946853dcc1e92ffa1c61ac330e5943903d30930b5c5 WatchSource:0}: Error finding container 114d61085ced481fa60c3946853dcc1e92ffa1c61ac330e5943903d30930b5c5: Status 404 returned error can't find the container with id 114d61085ced481fa60c3946853dcc1e92ffa1c61ac330e5943903d30930b5c5
Apr 22 18:53:09.082876 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.082790 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wz2bw"]
Apr 22 18:53:09.087226 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.087204 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:09.095606 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.095578 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wz2bw"]
Apr 22 18:53:09.135733 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.135693 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9798n" event={"ID":"f88426d2-d14e-4a09-8a41-e021bd41c917","Type":"ContainerStarted","Data":"114d61085ced481fa60c3946853dcc1e92ffa1c61ac330e5943903d30930b5c5"}
Apr 22 18:53:09.218179 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.218141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9p7b\" (UniqueName: \"kubernetes.io/projected/d899368d-1ef9-4808-a89b-c3541844e345-kube-api-access-m9p7b\") pod \"kuadrant-operator-catalog-wz2bw\" (UID: \"d899368d-1ef9-4808-a89b-c3541844e345\") " pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:09.319679 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.319646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9p7b\" (UniqueName: \"kubernetes.io/projected/d899368d-1ef9-4808-a89b-c3541844e345-kube-api-access-m9p7b\") pod \"kuadrant-operator-catalog-wz2bw\" (UID: \"d899368d-1ef9-4808-a89b-c3541844e345\") " pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:09.330567 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.330533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9p7b\" (UniqueName: \"kubernetes.io/projected/d899368d-1ef9-4808-a89b-c3541844e345-kube-api-access-m9p7b\") pod \"kuadrant-operator-catalog-wz2bw\" (UID: \"d899368d-1ef9-4808-a89b-c3541844e345\") " pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:09.397429 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.397346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:09.479977 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.479853 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"]
Apr 22 18:53:09.483259 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.483214 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.487816 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.487790 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 22 18:53:09.488946 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.488512 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 22 18:53:09.488946 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.488526 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 18:53:09.488946 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.488763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 22 18:53:09.490366 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.490199 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-dkf85\""
Apr 22 18:53:09.510929 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.510898 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"]
Apr 22 18:53:09.521163 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.521119 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.521314 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.521222 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.521314 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.521254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.521314 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.521280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj8cd\" (UniqueName: \"kubernetes.io/projected/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-kube-api-access-gj8cd\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.521453 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.521365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.521453 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.521391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.521453 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.521430 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.552554 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.552526 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wz2bw"]
Apr 22 18:53:09.586319 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:53:09.586288 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd899368d_1ef9_4808_a89b_c3541844e345.slice/crio-4466e2c435d48204797ace5e870a7e48f90d675533acb8ed1f6b83c8deef9e06 WatchSource:0}: Error finding container 4466e2c435d48204797ace5e870a7e48f90d675533acb8ed1f6b83c8deef9e06: Status 404 returned error can't find the container with id 4466e2c435d48204797ace5e870a7e48f90d675533acb8ed1f6b83c8deef9e06
Apr 22 18:53:09.622060 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.622215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.622215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj8cd\" (UniqueName: \"kubernetes.io/projected/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-kube-api-access-gj8cd\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.622215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.622215 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.622433 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.622433 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.622684 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.622623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.624728 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.624705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.624833 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.624809 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.625126 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.625109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.625270 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.625251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.633294 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.633273 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.633417 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.633306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj8cd\" (UniqueName: \"kubernetes.io/projected/4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa-kube-api-access-gj8cd\") pod \"istiod-openshift-gateway-55ff986f96-ps5kk\" (UID: \"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.795625 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.795592 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:09.968690 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:09.968610 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"]
Apr 22 18:53:09.992611 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:53:09.992576 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c6b2d02_cb8b_45e2_961f_7d4f5e2658aa.slice/crio-0272d6ef55d2e03f182c4526d4ec89005cc6a4a64481d9c2daf880bf17dea181 WatchSource:0}: Error finding container 0272d6ef55d2e03f182c4526d4ec89005cc6a4a64481d9c2daf880bf17dea181: Status 404 returned error can't find the container with id 0272d6ef55d2e03f182c4526d4ec89005cc6a4a64481d9c2daf880bf17dea181
Apr 22 18:53:10.140855 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:10.140777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw" event={"ID":"d899368d-1ef9-4808-a89b-c3541844e345","Type":"ContainerStarted","Data":"4466e2c435d48204797ace5e870a7e48f90d675533acb8ed1f6b83c8deef9e06"}
Apr 22 18:53:10.141996 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:10.141970 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk" event={"ID":"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa","Type":"ContainerStarted","Data":"0272d6ef55d2e03f182c4526d4ec89005cc6a4a64481d9c2daf880bf17dea181"}
Apr 22 18:53:12.150805 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:12.150768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9798n" event={"ID":"f88426d2-d14e-4a09-8a41-e021bd41c917","Type":"ContainerStarted","Data":"bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c"}
Apr 22 18:53:12.151301 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:12.150843 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-9798n" podUID="f88426d2-d14e-4a09-8a41-e021bd41c917" containerName="registry-server" containerID="cri-o://bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c" gracePeriod=2
Apr 22 18:53:12.153441 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:12.153399 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw" event={"ID":"d899368d-1ef9-4808-a89b-c3541844e345","Type":"ContainerStarted","Data":"6e78d64722e5f974147ddaf0a9cf476865ba79ac35ec10795758856272ae1417"}
Apr 22 18:53:12.167569 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:12.167510 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-9798n" podStartSLOduration=1.872078152 podStartE2EDuration="4.167492225s" podCreationTimestamp="2026-04-22 18:53:08 +0000 UTC" firstStartedPulling="2026-04-22 18:53:08.954966298 +0000 UTC m=+655.197306675" lastFinishedPulling="2026-04-22 18:53:11.250380368 +0000 UTC m=+657.492720748" observedRunningTime="2026-04-22 18:53:12.166624864 +0000 UTC m=+658.408965264" watchObservedRunningTime="2026-04-22 18:53:12.167492225 +0000 UTC m=+658.409832625"
Apr 22 18:53:12.182208 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:12.182164 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw" podStartSLOduration=1.515903024 podStartE2EDuration="3.182148438s" podCreationTimestamp="2026-04-22 18:53:09 +0000 UTC" firstStartedPulling="2026-04-22 18:53:09.587674431 +0000 UTC m=+655.830014807" lastFinishedPulling="2026-04-22 18:53:11.253919832 +0000 UTC m=+657.496260221" observedRunningTime="2026-04-22 18:53:12.180522379 +0000 UTC m=+658.422862780" watchObservedRunningTime="2026-04-22 18:53:12.182148438 +0000 UTC m=+658.424488869"
Apr 22 18:53:13.020519 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.020482 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 22 18:53:13.020664 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.020546 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 22 18:53:13.096496 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.096477 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9798n"
Apr 22 18:53:13.155150 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.155113 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhntr\" (UniqueName: \"kubernetes.io/projected/f88426d2-d14e-4a09-8a41-e021bd41c917-kube-api-access-fhntr\") pod \"f88426d2-d14e-4a09-8a41-e021bd41c917\" (UID: \"f88426d2-d14e-4a09-8a41-e021bd41c917\") "
Apr 22 18:53:13.158042 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.158006 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88426d2-d14e-4a09-8a41-e021bd41c917-kube-api-access-fhntr" (OuterVolumeSpecName: "kube-api-access-fhntr") pod "f88426d2-d14e-4a09-8a41-e021bd41c917" (UID: "f88426d2-d14e-4a09-8a41-e021bd41c917"). InnerVolumeSpecName "kube-api-access-fhntr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:53:13.158271 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.158251 2578 generic.go:358] "Generic (PLEG): container finished" podID="f88426d2-d14e-4a09-8a41-e021bd41c917" containerID="bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c" exitCode=0
Apr 22 18:53:13.158334 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.158308 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-9798n"
Apr 22 18:53:13.158372 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.158353 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9798n" event={"ID":"f88426d2-d14e-4a09-8a41-e021bd41c917","Type":"ContainerDied","Data":"bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c"}
Apr 22 18:53:13.158413 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.158382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-9798n" event={"ID":"f88426d2-d14e-4a09-8a41-e021bd41c917","Type":"ContainerDied","Data":"114d61085ced481fa60c3946853dcc1e92ffa1c61ac330e5943903d30930b5c5"}
Apr 22 18:53:13.158413 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.158403 2578 scope.go:117] "RemoveContainer" containerID="bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c"
Apr 22 18:53:13.160087 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.160060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk" event={"ID":"4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa","Type":"ContainerStarted","Data":"471772031a886717c082827c7f0b097e05099a33dfc6c3c4c402e179227ee7b7"}
Apr 22 18:53:13.171241 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.171222 2578 scope.go:117] "RemoveContainer" containerID="bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c"
Apr 22 18:53:13.171544 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:53:13.171514 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c\": container with ID starting with bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c not found: ID does not exist" containerID="bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c"
Apr 22 18:53:13.171602 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.171554 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c"} err="failed to get container status \"bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c\": rpc error: code = NotFound desc = could not find container \"bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c\": container with ID starting with bc48d4bcc1b13f62d78065a951b9e09e9bd729796aec31ffaf9bc60c80ac5d4c not found: ID does not exist"
Apr 22 18:53:13.187098 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.187052 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk" podStartSLOduration=1.164197957 podStartE2EDuration="4.187039278s" podCreationTimestamp="2026-04-22 18:53:09 +0000 UTC" firstStartedPulling="2026-04-22 18:53:09.99741676 +0000 UTC m=+656.239757148" lastFinishedPulling="2026-04-22 18:53:13.020258092 +0000 UTC m=+659.262598469" observedRunningTime="2026-04-22 18:53:13.185364043 +0000 UTC m=+659.427704444" watchObservedRunningTime="2026-04-22 18:53:13.187039278 +0000 UTC m=+659.429379678"
Apr 22 18:53:13.204830 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.204803 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9798n"]
Apr 22 18:53:13.208210 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.208186 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-9798n"]
Apr 22 18:53:13.256736 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:13.256653 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhntr\" (UniqueName: \"kubernetes.io/projected/f88426d2-d14e-4a09-8a41-e021bd41c917-kube-api-access-fhntr\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\""
Apr 22 18:53:14.166782 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:14.166743 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:14.168381 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:14.168333 2578 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-ps5kk container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 22 18:53:14.168513 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:14.168400 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk" podUID="4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:53:14.298289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:14.298256 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88426d2-d14e-4a09-8a41-e021bd41c917" path="/var/lib/kubelet/pods/f88426d2-d14e-4a09-8a41-e021bd41c917/volumes"
Apr 22 18:53:15.170178 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:15.170144 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-ps5kk"
Apr 22 18:53:19.397598 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:19.397556 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:19.398022 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:19.397614 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:19.418627 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:19.418603 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:20.206174 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:20.206147 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-wz2bw"
Apr 22 18:53:43.411141 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.411103 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"]
Apr 22 18:53:43.411592 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.411385 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f88426d2-d14e-4a09-8a41-e021bd41c917" containerName="registry-server"
Apr 22 18:53:43.411592 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.411395 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f88426d2-d14e-4a09-8a41-e021bd41c917" containerName="registry-server"
Apr 22 18:53:43.411592 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.411454 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f88426d2-d14e-4a09-8a41-e021bd41c917" containerName="registry-server"
Apr 22 18:53:43.414475 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.414458 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"
Apr 22 18:53:43.417057 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.417029 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 22 18:53:43.417197 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.417146 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-2g8z7\""
Apr 22 18:53:43.423797 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.423774 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"]
Apr 22 18:53:43.597145 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.597091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnlkg\" (UniqueName: \"kubernetes.io/projected/7796bb22-56d3-4eb5-a146-cced15299212-kube-api-access-bnlkg\") pod \"dns-operator-controller-manager-648d5c98bc-9l6dw\" (UID: \"7796bb22-56d3-4eb5-a146-cced15299212\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"
Apr 22 18:53:43.698498 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.698465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnlkg\" (UniqueName: \"kubernetes.io/projected/7796bb22-56d3-4eb5-a146-cced15299212-kube-api-access-bnlkg\") pod \"dns-operator-controller-manager-648d5c98bc-9l6dw\" (UID: \"7796bb22-56d3-4eb5-a146-cced15299212\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"
Apr 22 18:53:43.708563 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.708533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnlkg\" (UniqueName: \"kubernetes.io/projected/7796bb22-56d3-4eb5-a146-cced15299212-kube-api-access-bnlkg\") pod \"dns-operator-controller-manager-648d5c98bc-9l6dw\" (UID: \"7796bb22-56d3-4eb5-a146-cced15299212\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"
Apr 22 18:53:43.725203 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.725174 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"
Apr 22 18:53:43.845511 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:43.845475 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"]
Apr 22 18:53:43.848794 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:53:43.848763 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7796bb22_56d3_4eb5_a146_cced15299212.slice/crio-91e6557cf60c8dfdd15ffd355c7128317341b6aa63c4f2ebf68533b35878289c WatchSource:0}: Error finding container 91e6557cf60c8dfdd15ffd355c7128317341b6aa63c4f2ebf68533b35878289c: Status 404 returned error can't find the container with id 91e6557cf60c8dfdd15ffd355c7128317341b6aa63c4f2ebf68533b35878289c
Apr 22 18:53:44.256880 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:44.256845 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw" event={"ID":"7796bb22-56d3-4eb5-a146-cced15299212","Type":"ContainerStarted","Data":"91e6557cf60c8dfdd15ffd355c7128317341b6aa63c4f2ebf68533b35878289c"}
Apr 22 18:53:46.264279 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.264163 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"]
Apr 22 18:53:46.267859 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.267837 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"
Apr 22 18:53:46.271481 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.271458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-99qgp\""
Apr 22 18:53:46.280982 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.280959 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"]
Apr 22 18:53:46.416957 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.416931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcfds\" (UniqueName: \"kubernetes.io/projected/e7303a9f-0f43-4a67-b004-63e89d607c77-kube-api-access-qcfds\") pod \"limitador-operator-controller-manager-85c4996f8c-4z785\" (UID: \"e7303a9f-0f43-4a67-b004-63e89d607c77\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"
Apr 22 18:53:46.517902 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.517805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcfds\" (UniqueName: \"kubernetes.io/projected/e7303a9f-0f43-4a67-b004-63e89d607c77-kube-api-access-qcfds\") pod \"limitador-operator-controller-manager-85c4996f8c-4z785\" (UID: \"e7303a9f-0f43-4a67-b004-63e89d607c77\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"
Apr 22 18:53:46.531577 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.531549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcfds\" (UniqueName: \"kubernetes.io/projected/e7303a9f-0f43-4a67-b004-63e89d607c77-kube-api-access-qcfds\") pod \"limitador-operator-controller-manager-85c4996f8c-4z785\" (UID: \"e7303a9f-0f43-4a67-b004-63e89d607c77\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"
Apr 22 18:53:46.579268 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.579228 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"
Apr 22 18:53:46.704179 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:46.704156 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"]
Apr 22 18:53:46.706896 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:53:46.706869 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7303a9f_0f43_4a67_b004_63e89d607c77.slice/crio-9b408a7f341c98b161b9a245b70daa039c6562556681bd98b717566f3411a9b0 WatchSource:0}: Error finding container 9b408a7f341c98b161b9a245b70daa039c6562556681bd98b717566f3411a9b0: Status 404 returned error can't find the container with id 9b408a7f341c98b161b9a245b70daa039c6562556681bd98b717566f3411a9b0
Apr 22 18:53:47.271861 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:47.271558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw" event={"ID":"7796bb22-56d3-4eb5-a146-cced15299212","Type":"ContainerStarted","Data":"c7972fea4323ed9900b52ee2e9ded4dba858458cbfd23df202318577c1ad49d2"}
Apr 22 18:53:47.271861 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:47.271702 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw"
Apr 22 18:53:47.273321 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:47.273288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" event={"ID":"e7303a9f-0f43-4a67-b004-63e89d607c77","Type":"ContainerStarted","Data":"9b408a7f341c98b161b9a245b70daa039c6562556681bd98b717566f3411a9b0"}
Apr 22 18:53:47.305446 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:47.305394 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw" podStartSLOduration=1.821588539 podStartE2EDuration="4.305377898s" podCreationTimestamp="2026-04-22 18:53:43 +0000 UTC" firstStartedPulling="2026-04-22 18:53:43.851154295 +0000 UTC m=+690.093494672" lastFinishedPulling="2026-04-22 18:53:46.334943653 +0000 UTC m=+692.577284031" observedRunningTime="2026-04-22 18:53:47.30488701 +0000 UTC m=+693.547227422" watchObservedRunningTime="2026-04-22 18:53:47.305377898 +0000 UTC m=+693.547718296"
Apr 22 18:53:49.282993 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:49.282962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" event={"ID":"e7303a9f-0f43-4a67-b004-63e89d607c77","Type":"ContainerStarted","Data":"c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e"}
Apr 22 18:53:49.283357 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:49.283119 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"
Apr 22 18:53:49.300154 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:49.300106 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" podStartSLOduration=1.4251690639999999 podStartE2EDuration="3.300091206s" podCreationTimestamp="2026-04-22 18:53:46 +0000 UTC" firstStartedPulling="2026-04-22 18:53:46.708775524 +0000 UTC m=+692.951115902" lastFinishedPulling="2026-04-22 18:53:48.583697653 +0000 UTC m=+694.826038044" observedRunningTime="2026-04-22 18:53:49.299672292 +0000 UTC
m=+695.542012683" watchObservedRunningTime="2026-04-22 18:53:49.300091206 +0000 UTC m=+695.542431605" Apr 22 18:53:58.279783 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:53:58.279752 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9l6dw" Apr 22 18:54:00.249657 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.249618 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm"] Apr 22 18:54:00.253028 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.253008 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:00.255707 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.255688 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-brdcv\"" Apr 22 18:54:00.264138 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.264113 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm"] Apr 22 18:54:00.291926 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.291901 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" Apr 22 18:54:00.309407 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.309364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/74443004-09e1-4239-bfad-d99f7d8f4682-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:00.309606 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:54:00.309437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8b4k\" (UniqueName: \"kubernetes.io/projected/74443004-09e1-4239-bfad-d99f7d8f4682-kube-api-access-v8b4k\") pod \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:00.352065 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.352027 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm"] Apr 22 18:54:00.352291 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:54:00.352267 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-v8b4k], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" podUID="74443004-09e1-4239-bfad-d99f7d8f4682" Apr 22 18:54:00.410203 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.410171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/74443004-09e1-4239-bfad-d99f7d8f4682-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:00.410357 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.410229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8b4k\" (UniqueName: \"kubernetes.io/projected/74443004-09e1-4239-bfad-d99f7d8f4682-kube-api-access-v8b4k\") pod \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:00.410604 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.410580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/74443004-09e1-4239-bfad-d99f7d8f4682-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:00.424614 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.424581 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8b4k\" (UniqueName: \"kubernetes.io/projected/74443004-09e1-4239-bfad-d99f7d8f4682-kube-api-access-v8b4k\") pod \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:00.501123 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.501044 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm"] Apr 22 18:54:00.507485 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.507457 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm"] Apr 22 18:54:00.515709 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.515672 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"] Apr 22 18:54:00.515948 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.515907 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" containerName="manager" 
containerID="cri-o://c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e" gracePeriod=2 Apr 22 18:54:00.529339 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.529311 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785"] Apr 22 18:54:00.539024 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.539000 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj"] Apr 22 18:54:00.539278 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.539267 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" containerName="manager" Apr 22 18:54:00.539325 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.539280 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" containerName="manager" Apr 22 18:54:00.539358 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.539335 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" containerName="manager" Apr 22 18:54:00.542138 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.542120 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" Apr 22 18:54:00.544548 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.544522 2578 status_manager.go:895] "Failed to get status for pod" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z785\" is forbidden: User \"system:node:ip-10-0-129-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-85.ec2.internal' and this object" Apr 22 18:54:00.553804 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.553778 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj"] Apr 22 18:54:00.612017 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.611990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzxcm\" (UniqueName: \"kubernetes.io/projected/fc6d109a-4012-4df2-9871-766aed0933c1-kube-api-access-qzxcm\") pod \"limitador-operator-controller-manager-85c4996f8c-sdfvj\" (UID: \"fc6d109a-4012-4df2-9871-766aed0933c1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" Apr 22 18:54:00.712866 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.712836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzxcm\" (UniqueName: \"kubernetes.io/projected/fc6d109a-4012-4df2-9871-766aed0933c1-kube-api-access-qzxcm\") pod \"limitador-operator-controller-manager-85c4996f8c-sdfvj\" (UID: \"fc6d109a-4012-4df2-9871-766aed0933c1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" Apr 22 18:54:00.721738 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.721711 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qzxcm\" (UniqueName: \"kubernetes.io/projected/fc6d109a-4012-4df2-9871-766aed0933c1-kube-api-access-qzxcm\") pod \"limitador-operator-controller-manager-85c4996f8c-sdfvj\" (UID: \"fc6d109a-4012-4df2-9871-766aed0933c1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" Apr 22 18:54:00.744915 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.744892 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" Apr 22 18:54:00.747211 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.747187 2578 status_manager.go:895] "Failed to get status for pod" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z785\" is forbidden: User \"system:node:ip-10-0-129-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-85.ec2.internal' and this object" Apr 22 18:54:00.814134 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.814062 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcfds\" (UniqueName: \"kubernetes.io/projected/e7303a9f-0f43-4a67-b004-63e89d607c77-kube-api-access-qcfds\") pod \"e7303a9f-0f43-4a67-b004-63e89d607c77\" (UID: \"e7303a9f-0f43-4a67-b004-63e89d607c77\") " Apr 22 18:54:00.816191 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.816166 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7303a9f-0f43-4a67-b004-63e89d607c77-kube-api-access-qcfds" (OuterVolumeSpecName: "kube-api-access-qcfds") pod "e7303a9f-0f43-4a67-b004-63e89d607c77" (UID: "e7303a9f-0f43-4a67-b004-63e89d607c77"). InnerVolumeSpecName "kube-api-access-qcfds". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:00.900929 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.900901 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" Apr 22 18:54:00.915009 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:00.914981 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qcfds\" (UniqueName: \"kubernetes.io/projected/e7303a9f-0f43-4a67-b004-63e89d607c77-kube-api-access-qcfds\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:01.026854 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.026762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj"] Apr 22 18:54:01.029372 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:54:01.029345 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc6d109a_4012_4df2_9871_766aed0933c1.slice/crio-4c2ca512e9d2e582bd49df59ac849aad6347eaa73aea1148a753912d0371e5ed WatchSource:0}: Error finding container 4c2ca512e9d2e582bd49df59ac849aad6347eaa73aea1148a753912d0371e5ed: Status 404 returned error can't find the container with id 4c2ca512e9d2e582bd49df59ac849aad6347eaa73aea1148a753912d0371e5ed Apr 22 18:54:01.323080 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.322968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" event={"ID":"fc6d109a-4012-4df2-9871-766aed0933c1","Type":"ContainerStarted","Data":"78eba0726741c3459a05eff8b8549a48dbfbd9b1e79a9c923a64070db434722e"} Apr 22 18:54:01.323080 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.323023 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" 
event={"ID":"fc6d109a-4012-4df2-9871-766aed0933c1","Type":"ContainerStarted","Data":"4c2ca512e9d2e582bd49df59ac849aad6347eaa73aea1148a753912d0371e5ed"} Apr 22 18:54:01.323080 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.323072 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" Apr 22 18:54:01.324112 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.324086 2578 generic.go:358] "Generic (PLEG): container finished" podID="e7303a9f-0f43-4a67-b004-63e89d607c77" containerID="c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e" exitCode=0 Apr 22 18:54:01.324222 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.324139 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" Apr 22 18:54:01.324222 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.324184 2578 scope.go:117] "RemoveContainer" containerID="c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e" Apr 22 18:54:01.324302 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.324294 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:01.325548 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.325520 2578 status_manager.go:895] "Failed to get status for pod" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z785\" is forbidden: User \"system:node:ip-10-0-129-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-85.ec2.internal' and this object" Apr 22 18:54:01.330087 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.330065 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:01.334294 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.334269 2578 scope.go:117] "RemoveContainer" containerID="c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e" Apr 22 18:54:01.334546 ip-10-0-129-85 kubenswrapper[2578]: E0422 18:54:01.334528 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e\": container with ID starting with c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e not found: ID does not exist" containerID="c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e" Apr 22 18:54:01.334595 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.334555 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e"} err="failed to get container status \"c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e\": rpc error: code = NotFound desc = could not find container 
\"c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e\": container with ID starting with c57423957eff3e9d48aa87d7d508362902cd470cf6a4f3b01cebef4f330d366e not found: ID does not exist" Apr 22 18:54:01.346613 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.346572 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" podStartSLOduration=1.346561512 podStartE2EDuration="1.346561512s" podCreationTimestamp="2026-04-22 18:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:54:01.344875246 +0000 UTC m=+707.587215646" watchObservedRunningTime="2026-04-22 18:54:01.346561512 +0000 UTC m=+707.588901911" Apr 22 18:54:01.347028 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.347005 2578 status_manager.go:895] "Failed to get status for pod" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z785" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z785\" is forbidden: User \"system:node:ip-10-0-129-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-85.ec2.internal' and this object" Apr 22 18:54:01.348995 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.348968 2578 status_manager.go:895] "Failed to get status for pod" podUID="74443004-09e1-4239-bfad-d99f7d8f4682" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" err="pods \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" is forbidden: User \"system:node:ip-10-0-129-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-85.ec2.internal' and this object" Apr 22 18:54:01.418428 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 18:54:01.418392 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8b4k\" (UniqueName: \"kubernetes.io/projected/74443004-09e1-4239-bfad-d99f7d8f4682-kube-api-access-v8b4k\") pod \"74443004-09e1-4239-bfad-d99f7d8f4682\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " Apr 22 18:54:01.418562 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.418451 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/74443004-09e1-4239-bfad-d99f7d8f4682-extensions-socket-volume\") pod \"74443004-09e1-4239-bfad-d99f7d8f4682\" (UID: \"74443004-09e1-4239-bfad-d99f7d8f4682\") " Apr 22 18:54:01.418715 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.418690 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74443004-09e1-4239-bfad-d99f7d8f4682-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "74443004-09e1-4239-bfad-d99f7d8f4682" (UID: "74443004-09e1-4239-bfad-d99f7d8f4682"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:01.420555 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.420530 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74443004-09e1-4239-bfad-d99f7d8f4682-kube-api-access-v8b4k" (OuterVolumeSpecName: "kube-api-access-v8b4k") pod "74443004-09e1-4239-bfad-d99f7d8f4682" (UID: "74443004-09e1-4239-bfad-d99f7d8f4682"). InnerVolumeSpecName "kube-api-access-v8b4k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:01.519085 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.519049 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8b4k\" (UniqueName: \"kubernetes.io/projected/74443004-09e1-4239-bfad-d99f7d8f4682-kube-api-access-v8b4k\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:01.519085 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:01.519078 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/74443004-09e1-4239-bfad-d99f7d8f4682-extensions-socket-volume\") on node \"ip-10-0-129-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:02.298820 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:02.298787 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74443004-09e1-4239-bfad-d99f7d8f4682" path="/var/lib/kubelet/pods/74443004-09e1-4239-bfad-d99f7d8f4682/volumes" Apr 22 18:54:02.299036 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:02.299023 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7303a9f-0f43-4a67-b004-63e89d607c77" path="/var/lib/kubelet/pods/e7303a9f-0f43-4a67-b004-63e89d607c77/volumes" Apr 22 18:54:02.328289 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:02.328264 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" Apr 22 18:54:02.332401 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:02.332375 2578 status_manager.go:895] "Failed to get status for pod" podUID="74443004-09e1-4239-bfad-d99f7d8f4682" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" err="pods \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" is forbidden: User \"system:node:ip-10-0-129-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-85.ec2.internal' and this object" Apr 22 18:54:04.298749 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:04.298710 2578 status_manager.go:895] "Failed to get status for pod" podUID="74443004-09e1-4239-bfad-d99f7d8f4682" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-mk8xm" err="pods \"kuadrant-operator-controller-manager-84b657d985-mk8xm\" is forbidden: User \"system:node:ip-10-0-129-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-85.ec2.internal' and this object" Apr 22 18:54:12.330082 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:12.330055 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sdfvj" Apr 22 18:54:38.759083 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.759041 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:54:38.761280 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.761258 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:38.763912 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.763880 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 18:54:38.763912 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.763885 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xztn6\"" Apr 22 18:54:38.772295 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.772267 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:54:38.798283 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.798254 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:54:38.909181 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.909141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvnd\" (UniqueName: \"kubernetes.io/projected/b588e6ac-7ca8-4cdb-9702-e144ab9674dd-kube-api-access-ggvnd\") pod \"limitador-limitador-78c99df468-rg9sg\" (UID: \"b588e6ac-7ca8-4cdb-9702-e144ab9674dd\") " pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:38.909342 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:38.909192 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b588e6ac-7ca8-4cdb-9702-e144ab9674dd-config-file\") pod \"limitador-limitador-78c99df468-rg9sg\" (UID: \"b588e6ac-7ca8-4cdb-9702-e144ab9674dd\") " pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:39.009866 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:39.009781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvnd\" 
(UniqueName: \"kubernetes.io/projected/b588e6ac-7ca8-4cdb-9702-e144ab9674dd-kube-api-access-ggvnd\") pod \"limitador-limitador-78c99df468-rg9sg\" (UID: \"b588e6ac-7ca8-4cdb-9702-e144ab9674dd\") " pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:39.009866 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:39.009832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b588e6ac-7ca8-4cdb-9702-e144ab9674dd-config-file\") pod \"limitador-limitador-78c99df468-rg9sg\" (UID: \"b588e6ac-7ca8-4cdb-9702-e144ab9674dd\") " pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:39.010418 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:39.010401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b588e6ac-7ca8-4cdb-9702-e144ab9674dd-config-file\") pod \"limitador-limitador-78c99df468-rg9sg\" (UID: \"b588e6ac-7ca8-4cdb-9702-e144ab9674dd\") " pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:39.020436 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:39.020413 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvnd\" (UniqueName: \"kubernetes.io/projected/b588e6ac-7ca8-4cdb-9702-e144ab9674dd-kube-api-access-ggvnd\") pod \"limitador-limitador-78c99df468-rg9sg\" (UID: \"b588e6ac-7ca8-4cdb-9702-e144ab9674dd\") " pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:39.072338 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:39.072309 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:39.415534 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:39.415454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:54:39.419911 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:54:39.419876 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb588e6ac_7ca8_4cdb_9702_e144ab9674dd.slice/crio-02516052aac54a70ac02c89fc5004b7e2c399dde975db78fc5834dce21606b9c WatchSource:0}: Error finding container 02516052aac54a70ac02c89fc5004b7e2c399dde975db78fc5834dce21606b9c: Status 404 returned error can't find the container with id 02516052aac54a70ac02c89fc5004b7e2c399dde975db78fc5834dce21606b9c Apr 22 18:54:39.443743 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:39.443707 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" event={"ID":"b588e6ac-7ca8-4cdb-9702-e144ab9674dd","Type":"ContainerStarted","Data":"02516052aac54a70ac02c89fc5004b7e2c399dde975db78fc5834dce21606b9c"} Apr 22 18:54:42.454594 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:42.454555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" event={"ID":"b588e6ac-7ca8-4cdb-9702-e144ab9674dd","Type":"ContainerStarted","Data":"5a20c861ebee2149c83520f32b6ce04506235c950a76a14f2d5dc12706ac98ea"} Apr 22 18:54:42.454970 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:42.454631 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:54:42.471988 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:42.471922 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" podStartSLOduration=2.060374222 
podStartE2EDuration="4.471906272s" podCreationTimestamp="2026-04-22 18:54:38 +0000 UTC" firstStartedPulling="2026-04-22 18:54:39.421835866 +0000 UTC m=+745.664176243" lastFinishedPulling="2026-04-22 18:54:41.833367916 +0000 UTC m=+748.075708293" observedRunningTime="2026-04-22 18:54:42.471205622 +0000 UTC m=+748.713546025" watchObservedRunningTime="2026-04-22 18:54:42.471906272 +0000 UTC m=+748.714246676" Apr 22 18:54:53.462395 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:54:53.462366 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-rg9sg" Apr 22 18:55:20.299592 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:20.299563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:55:51.422078 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.422040 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-59797bb5d8-4hkmj"] Apr 22 18:55:51.424615 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.424591 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.428415 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.428394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-4fsbs\"" Apr 22 18:55:51.428415 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.428408 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 18:55:51.428597 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.428486 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 18:55:51.433266 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.433229 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-59797bb5d8-4hkmj"] Apr 22 18:55:51.467395 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.467363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl8r6\" (UniqueName: \"kubernetes.io/projected/0a22f3b4-0d30-421c-b59d-9b8e01ab5a93-kube-api-access-hl8r6\") pod \"maas-api-59797bb5d8-4hkmj\" (UID: \"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93\") " pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.467558 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.467440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0a22f3b4-0d30-421c-b59d-9b8e01ab5a93-maas-api-tls\") pod \"maas-api-59797bb5d8-4hkmj\" (UID: \"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93\") " pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.568379 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.568329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl8r6\" (UniqueName: \"kubernetes.io/projected/0a22f3b4-0d30-421c-b59d-9b8e01ab5a93-kube-api-access-hl8r6\") pod 
\"maas-api-59797bb5d8-4hkmj\" (UID: \"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93\") " pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.568578 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.568409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0a22f3b4-0d30-421c-b59d-9b8e01ab5a93-maas-api-tls\") pod \"maas-api-59797bb5d8-4hkmj\" (UID: \"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93\") " pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.571035 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.571009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/0a22f3b4-0d30-421c-b59d-9b8e01ab5a93-maas-api-tls\") pod \"maas-api-59797bb5d8-4hkmj\" (UID: \"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93\") " pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.578147 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.578099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl8r6\" (UniqueName: \"kubernetes.io/projected/0a22f3b4-0d30-421c-b59d-9b8e01ab5a93-kube-api-access-hl8r6\") pod \"maas-api-59797bb5d8-4hkmj\" (UID: \"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93\") " pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.736114 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.736078 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:51.865034 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:51.864910 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-59797bb5d8-4hkmj"] Apr 22 18:55:51.867759 ip-10-0-129-85 kubenswrapper[2578]: W0422 18:55:51.867726 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a22f3b4_0d30_421c_b59d_9b8e01ab5a93.slice/crio-169eaada57352ef6d913168f63c89f4efe00b15769392f6a5aee60bb5a9ee431 WatchSource:0}: Error finding container 169eaada57352ef6d913168f63c89f4efe00b15769392f6a5aee60bb5a9ee431: Status 404 returned error can't find the container with id 169eaada57352ef6d913168f63c89f4efe00b15769392f6a5aee60bb5a9ee431 Apr 22 18:55:52.685872 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:52.685820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-59797bb5d8-4hkmj" event={"ID":"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93","Type":"ContainerStarted","Data":"169eaada57352ef6d913168f63c89f4efe00b15769392f6a5aee60bb5a9ee431"} Apr 22 18:55:54.695561 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:54.695514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-59797bb5d8-4hkmj" event={"ID":"0a22f3b4-0d30-421c-b59d-9b8e01ab5a93","Type":"ContainerStarted","Data":"fb4120624128c4be3a789db56b75f0dd84383188917c616897b0c23ab66ba8ec"} Apr 22 18:55:54.696005 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:54.695763 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:55:54.712450 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:55:54.712408 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-59797bb5d8-4hkmj" podStartSLOduration=1.265310694 podStartE2EDuration="3.712391696s" podCreationTimestamp="2026-04-22 18:55:51 +0000 UTC" 
firstStartedPulling="2026-04-22 18:55:51.869001565 +0000 UTC m=+818.111341945" lastFinishedPulling="2026-04-22 18:55:54.316082568 +0000 UTC m=+820.558422947" observedRunningTime="2026-04-22 18:55:54.711384009 +0000 UTC m=+820.953724408" watchObservedRunningTime="2026-04-22 18:55:54.712391696 +0000 UTC m=+820.954732122" Apr 22 18:56:00.704076 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:56:00.704044 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-59797bb5d8-4hkmj" Apr 22 18:56:01.071109 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:56:01.071024 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:56:14.082671 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:56:14.082603 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:56:34.681856 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:56:34.681820 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:56:37.493250 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:56:37.493216 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:56:41.485787 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:56:41.485747 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:56:50.290189 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:56:50.290153 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:57:14.264862 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:57:14.264832 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:57:14.267553 ip-10-0-129-85 
kubenswrapper[2578]: I0422 18:57:14.267531 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 18:58:14.284668 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:58:14.284618 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:58:24.390459 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:58:24.390422 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:58:33.092563 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:58:33.092527 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:58:43.482472 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:58:43.482431 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:58:52.089620 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:58:52.089586 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 18:59:02.887042 ip-10-0-129-85 kubenswrapper[2578]: I0422 18:59:02.887002 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:00:06.179358 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:00:06.179281 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:00:21.787210 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:00:21.787172 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:01:00.082156 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:01:00.082117 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:01:16.786223 
ip-10-0-129-85 kubenswrapper[2578]: I0422 19:01:16.786185 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:01:32.286246 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:01:32.286211 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:01:48.102271 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:01:48.102234 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:01:52.496385 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:01:52.496350 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:02:13.881876 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:02:13.881835 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:02:14.289493 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:02:14.289461 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:02:14.293576 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:02:14.293535 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:02:17.986379 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:02:17.986342 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:02:40.277071 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:02:40.277025 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:02:49.544525 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:02:49.544491 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:03:05.897017 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:03:05.896973 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:03:14.430183 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:03:14.430150 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:03:31.389237 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:03:31.389199 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:03:39.493383 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:03:39.493338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:04:12.384920 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:04:12.384882 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:04:20.982193 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:04:20.982153 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:04:29.683445 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:04:29.683364 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:04:37.687160 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:04:37.687125 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:04:45.882675 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:04:45.882621 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:05:03.383433 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:05:03.383389 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:05:13.078873 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:05:13.078834 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:00.989915 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:00.989827 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:09.108386 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:09.108349 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:18.086488 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:18.086455 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:26.787483 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:26.787447 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:35.406680 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:35.406615 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:43.903952 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:43.903912 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:52.892269 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:52.892230 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:06:58.183975 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:06:58.183941 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:07:02.078487 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:02.078449 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:07:11.783870 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:11.783833 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:07:14.310357 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:14.310330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:07:14.315469 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:14.315449 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:07:19.683495 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:19.683452 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:07:28.992474 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:28.992384 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:07:37.390270 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:37.390233 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:07:46.292343 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:46.292303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:07:54.398914 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:07:54.398881 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:08:03.301788 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:08:03.301746 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:08:11.090489 ip-10-0-129-85 kubenswrapper[2578]: I0422 
19:08:11.090454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:08:20.197214 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:08:20.197181 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:08:29.390293 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:08:29.390227 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:09:12.891529 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:09:12.891497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:09:18.790482 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:09:18.790447 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:10:49.646976 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:10:49.646887 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:10:54.935530 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:10:54.935493 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:11:19.339180 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:11:19.339142 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:11:24.440335 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:11:24.440301 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:11:34.045179 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:11:34.045139 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:11:43.734042 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:11:43.734005 2578 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:11:53.636597 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:11:53.636565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:12:03.935293 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:03.935216 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:12:13.441066 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:13.441028 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:12:14.330958 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:14.330929 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:12:14.336560 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:14.336538 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:12:23.333878 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:23.333839 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:12:32.377242 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:32.377202 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:12:42.941358 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:42.941316 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:12:52.148177 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:12:52.148138 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:13:26.435212 
ip-10-0-129-85 kubenswrapper[2578]: I0422 19:13:26.435180 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:14:09.934327 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:14:09.934290 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:14:18.037966 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:14:18.037933 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:14:27.337802 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:14:27.337769 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:14:35.837311 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:14:35.837273 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:14:43.751969 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:14:43.751926 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:14:57.241736 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:14:57.241702 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:15:05.334180 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:15:05.334086 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:15:13.448783 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:15:13.448749 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:15:22.146275 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:15:22.146228 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:15:30.381071 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 19:15:30.381034 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:15:38.945165 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:15:38.945125 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:15:49.348311 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:15:49.348279 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:16:07.551084 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:16:07.551044 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:16:15.746943 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:16:15.746911 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:16:24.657477 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:16:24.657442 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:16:32.266124 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:16:32.266085 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:16:49.744050 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:16:49.744016 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:16:58.138487 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:16:58.138453 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:17:06.843482 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:06.843444 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:17:14.353657 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:14.353614 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:17:14.357513 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:14.357483 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4r9_2a0c68c2-fd6c-49b2-bf04-84096034153e/ovn-acl-logging/0.log" Apr 22 19:17:14.437674 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:14.437623 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:17:24.383516 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:24.383485 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:17:32.441157 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:32.441119 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:17:41.137022 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:41.136987 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:17:52.739804 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:17:52.739762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:18:01.857600 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:18:01.857521 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:18:15.244660 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:18:15.244593 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:18:24.245060 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:18:24.245022 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:18:30.563725 
ip-10-0-129-85 kubenswrapper[2578]: I0422 19:18:30.563684 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:18:35.941482 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:18:35.941441 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:18:40.158754 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:18:40.158718 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:18:49.838205 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:18:49.838163 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:19:07.738034 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:19:07.737998 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:19:15.829730 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:19:15.829686 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:19:24.133408 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:19:24.133366 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:19:31.594663 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:19:31.594554 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:19:56.034204 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:19:56.034160 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:20:08.441500 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:08.441458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rg9sg"] Apr 22 19:20:14.932176 ip-10-0-129-85 kubenswrapper[2578]: 
I0422 19:20:14.932143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-xjx2g_e3890bca-d60a-440d-8bb4-506cbe672756/manager/0.log"
Apr 22 19:20:15.067289 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:15.067259 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-59797bb5d8-4hkmj_0a22f3b4-0d30-421c-b59d-9b8e01ab5a93/maas-api/0.log"
Apr 22 19:20:15.321740 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:15.321661 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-n2r78_1bdd73ee-82ea-4bf9-9629-952f519692d3/manager/2.log"
Apr 22 19:20:15.438741 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:15.438714 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-dd89cc56c-5spxh_3c9f2586-5824-4e5d-ad14-46003a8242ac/manager/0.log"
Apr 22 19:20:17.335994 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:17.335967 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9l6dw_7796bb22-56d3-4eb5-a146-cced15299212/manager/0.log"
Apr 22 19:20:17.584141 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:17.584116 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wz2bw_d899368d-1ef9-4808-a89b-c3541844e345/registry-server/0.log"
Apr 22 19:20:17.847171 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:17.847143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-rg9sg_b588e6ac-7ca8-4cdb-9702-e144ab9674dd/limitador/0.log"
Apr 22 19:20:17.958979 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:17.958946 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sdfvj_fc6d109a-4012-4df2-9871-766aed0933c1/manager/0.log"
Apr 22 19:20:18.421274 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:18.421244 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-ps5kk_4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa/discovery/0.log"
Apr 22 19:20:18.637109 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:18.637080 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7cff94f675-n926r_411946d3-1e28-4c55-85a3-640dbbfbed40/kube-auth-proxy/0.log"
Apr 22 19:20:23.397015 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.396979 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tz8b7/must-gather-7ht5s"]
Apr 22 19:20:23.400176 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.400157 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.403045 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.403018 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tz8b7\"/\"kube-root-ca.crt\""
Apr 22 19:20:23.403169 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.403047 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tz8b7\"/\"openshift-service-ca.crt\""
Apr 22 19:20:23.403169 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.403072 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tz8b7\"/\"default-dockercfg-7p5tx\""
Apr 22 19:20:23.415332 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.415292 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/must-gather-7ht5s"]
Apr 22 19:20:23.467002 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.466972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/983ef7a9-c787-4dca-9e5f-c905967c8466-must-gather-output\") pod \"must-gather-7ht5s\" (UID: \"983ef7a9-c787-4dca-9e5f-c905967c8466\") " pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.467172 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.467054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69cv\" (UniqueName: \"kubernetes.io/projected/983ef7a9-c787-4dca-9e5f-c905967c8466-kube-api-access-c69cv\") pod \"must-gather-7ht5s\" (UID: \"983ef7a9-c787-4dca-9e5f-c905967c8466\") " pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.568422 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.568389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/983ef7a9-c787-4dca-9e5f-c905967c8466-must-gather-output\") pod \"must-gather-7ht5s\" (UID: \"983ef7a9-c787-4dca-9e5f-c905967c8466\") " pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.568600 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.568481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c69cv\" (UniqueName: \"kubernetes.io/projected/983ef7a9-c787-4dca-9e5f-c905967c8466-kube-api-access-c69cv\") pod \"must-gather-7ht5s\" (UID: \"983ef7a9-c787-4dca-9e5f-c905967c8466\") " pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.568780 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.568757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/983ef7a9-c787-4dca-9e5f-c905967c8466-must-gather-output\") pod \"must-gather-7ht5s\" (UID: \"983ef7a9-c787-4dca-9e5f-c905967c8466\") " pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.578291 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.578269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69cv\" (UniqueName: \"kubernetes.io/projected/983ef7a9-c787-4dca-9e5f-c905967c8466-kube-api-access-c69cv\") pod \"must-gather-7ht5s\" (UID: \"983ef7a9-c787-4dca-9e5f-c905967c8466\") " pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.709624 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.709588 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tz8b7/must-gather-7ht5s"
Apr 22 19:20:23.832471 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.832440 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/must-gather-7ht5s"]
Apr 22 19:20:23.835406 ip-10-0-129-85 kubenswrapper[2578]: W0422 19:20:23.835374 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod983ef7a9_c787_4dca_9e5f_c905967c8466.slice/crio-4045850f111b2023d977dd125b6fe1f70fcd6bdc2a2181acd08eb290f4b4a9e1 WatchSource:0}: Error finding container 4045850f111b2023d977dd125b6fe1f70fcd6bdc2a2181acd08eb290f4b4a9e1: Status 404 returned error can't find the container with id 4045850f111b2023d977dd125b6fe1f70fcd6bdc2a2181acd08eb290f4b4a9e1
Apr 22 19:20:23.837076 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:23.837059 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:20:24.446572 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:24.446532 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/must-gather-7ht5s" event={"ID":"983ef7a9-c787-4dca-9e5f-c905967c8466","Type":"ContainerStarted","Data":"4045850f111b2023d977dd125b6fe1f70fcd6bdc2a2181acd08eb290f4b4a9e1"}
Apr 22 19:20:25.453132 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:25.453082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/must-gather-7ht5s" event={"ID":"983ef7a9-c787-4dca-9e5f-c905967c8466","Type":"ContainerStarted","Data":"653aec8c80ef16bdfaeb5b5e8306ae665e3bded576f2662f95cc1dcbce1cb471"}
Apr 22 19:20:25.453625 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:25.453139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/must-gather-7ht5s" event={"ID":"983ef7a9-c787-4dca-9e5f-c905967c8466","Type":"ContainerStarted","Data":"90876b7768b05d067d4da05d0637996004715c58b8845945d2438440576833c9"}
Apr 22 19:20:25.477977 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:25.477920 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tz8b7/must-gather-7ht5s" podStartSLOduration=1.692511337 podStartE2EDuration="2.477902789s" podCreationTimestamp="2026-04-22 19:20:23 +0000 UTC" firstStartedPulling="2026-04-22 19:20:23.837216102 +0000 UTC m=+2290.079556479" lastFinishedPulling="2026-04-22 19:20:24.622607551 +0000 UTC m=+2290.864947931" observedRunningTime="2026-04-22 19:20:25.475723585 +0000 UTC m=+2291.718063987" watchObservedRunningTime="2026-04-22 19:20:25.477902789 +0000 UTC m=+2291.720243191"
Apr 22 19:20:26.485731 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:26.485697 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tdb8c_83ef0e88-f056-4378-862a-0cd9fcd367d1/global-pull-secret-syncer/0.log"
Apr 22 19:20:26.583094 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:26.583052 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7n89x_3f34c1a3-3713-45c0-b770-cec5862c620d/konnectivity-agent/0.log"
Apr 22 19:20:26.656914 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:26.656885 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-85.ec2.internal_79daf747565244a11b1a61e38cc6d0df/haproxy/0.log"
Apr 22 19:20:31.448526 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:31.448497 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9l6dw_7796bb22-56d3-4eb5-a146-cced15299212/manager/0.log"
Apr 22 19:20:31.549795 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:31.549761 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wz2bw_d899368d-1ef9-4808-a89b-c3541844e345/registry-server/0.log"
Apr 22 19:20:31.680866 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:31.680830 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-rg9sg_b588e6ac-7ca8-4cdb-9702-e144ab9674dd/limitador/0.log"
Apr 22 19:20:31.768382 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:31.768290 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sdfvj_fc6d109a-4012-4df2-9871-766aed0933c1/manager/0.log"
Apr 22 19:20:33.610344 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:33.610313 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmsbg_ce009679-a885-4ba1-a31d-8658c5ba82eb/node-exporter/0.log"
Apr 22 19:20:33.639939 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:33.639909 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmsbg_ce009679-a885-4ba1-a31d-8658c5ba82eb/kube-rbac-proxy/0.log"
Apr 22 19:20:33.666567 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:33.666529 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dmsbg_ce009679-a885-4ba1-a31d-8658c5ba82eb/init-textfile/0.log"
Apr 22 19:20:34.582987 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.582949 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"]
Apr 22 19:20:34.586716 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.586692 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.596631 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.596601 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"]
Apr 22 19:20:34.680323 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.680287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-sys\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.680827 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.680344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjrm\" (UniqueName: \"kubernetes.io/projected/b98b911f-4efb-4421-b6ab-16e6e23f25cd-kube-api-access-5kjrm\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.680827 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.680385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-lib-modules\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.680827 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.680454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-podres\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.680827 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.680528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-proc\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.781690 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-proc\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.781690 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-sys\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.781931 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781736 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjrm\" (UniqueName: \"kubernetes.io/projected/b98b911f-4efb-4421-b6ab-16e6e23f25cd-kube-api-access-5kjrm\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.781931 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-proc\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.781931 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-lib-modules\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.781931 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-podres\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.781931 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-sys\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.782177 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-podres\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.782177 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.781940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b98b911f-4efb-4421-b6ab-16e6e23f25cd-lib-modules\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.792269 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.792236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjrm\" (UniqueName: \"kubernetes.io/projected/b98b911f-4efb-4421-b6ab-16e6e23f25cd-kube-api-access-5kjrm\") pod \"perf-node-gather-daemonset-bg2dx\" (UID: \"b98b911f-4efb-4421-b6ab-16e6e23f25cd\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:34.899266 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:34.898802 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:35.054891 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:35.054868 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"]
Apr 22 19:20:35.058225 ip-10-0-129-85 kubenswrapper[2578]: W0422 19:20:35.058188 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb98b911f_4efb_4421_b6ab_16e6e23f25cd.slice/crio-8341d4cfe9dc88b01cc19a64491deb4c38ef5fb4facc1f1132e5f88d80b927ac WatchSource:0}: Error finding container 8341d4cfe9dc88b01cc19a64491deb4c38ef5fb4facc1f1132e5f88d80b927ac: Status 404 returned error can't find the container with id 8341d4cfe9dc88b01cc19a64491deb4c38ef5fb4facc1f1132e5f88d80b927ac
Apr 22 19:20:35.370516 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:35.370484 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qfncj_505044c7-242a-4290-ab95-7e778348f684/networking-console-plugin/0.log"
Apr 22 19:20:35.496854 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:35.496772 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx" event={"ID":"b98b911f-4efb-4421-b6ab-16e6e23f25cd","Type":"ContainerStarted","Data":"8b9ebdda82e9be56f7a4cd055fd0853561e5bf58afaa7eee261a192eb4fbd59d"}
Apr 22 19:20:35.496854 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:35.496821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx" event={"ID":"b98b911f-4efb-4421-b6ab-16e6e23f25cd","Type":"ContainerStarted","Data":"8341d4cfe9dc88b01cc19a64491deb4c38ef5fb4facc1f1132e5f88d80b927ac"}
Apr 22 19:20:35.497036 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:35.496860 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:35.514084 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:35.514028 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx" podStartSLOduration=1.514008875 podStartE2EDuration="1.514008875s" podCreationTimestamp="2026-04-22 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:20:35.511489615 +0000 UTC m=+2301.753830015" watchObservedRunningTime="2026-04-22 19:20:35.514008875 +0000 UTC m=+2301.756349273"
Apr 22 19:20:37.631897 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:37.631868 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gtx4b_cbe11d10-5c49-495a-8459-b9d0af0389ae/dns/0.log"
Apr 22 19:20:37.658820 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:37.658796 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gtx4b_cbe11d10-5c49-495a-8459-b9d0af0389ae/kube-rbac-proxy/0.log"
Apr 22 19:20:37.763069 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:37.763043 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b5q9v_b4a78812-3844-4d64-b8b5-016856db881b/dns-node-resolver/0.log"
Apr 22 19:20:38.299789 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:38.299760 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-67qqz_6a3096c0-28bb-48d6-a4f9-f3fc9bf195d8/node-ca/0.log"
Apr 22 19:20:39.271988 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:39.271958 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-ps5kk_4c6b2d02-cb8b-45e2-961f-7d4f5e2658aa/discovery/0.log"
Apr 22 19:20:39.320187 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:39.320159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7cff94f675-n926r_411946d3-1e28-4c55-85a3-640dbbfbed40/kube-auth-proxy/0.log"
Apr 22 19:20:39.987582 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:39.987552 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nmmxl_cd714644-718d-4a14-9b70-5b3aa5980856/serve-healthcheck-canary/0.log"
Apr 22 19:20:40.529859 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:40.529831 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d52x2_0e3c2ff2-cc57-4683-bd7f-2538ffeb7788/kube-rbac-proxy/0.log"
Apr 22 19:20:40.551322 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:40.551297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d52x2_0e3c2ff2-cc57-4683-bd7f-2538ffeb7788/exporter/0.log"
Apr 22 19:20:40.573360 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:40.573339 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d52x2_0e3c2ff2-cc57-4683-bd7f-2538ffeb7788/extractor/0.log"
Apr 22 19:20:41.510445 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:41.510416 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-bg2dx"
Apr 22 19:20:42.564473 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:42.564434 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-xjx2g_e3890bca-d60a-440d-8bb4-506cbe672756/manager/0.log"
Apr 22 19:20:42.601572 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:42.601538 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-59797bb5d8-4hkmj_0a22f3b4-0d30-421c-b59d-9b8e01ab5a93/maas-api/0.log"
Apr 22 19:20:42.701363 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:42.701322 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-n2r78_1bdd73ee-82ea-4bf9-9629-952f519692d3/manager/1.log"
Apr 22 19:20:42.714999 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:42.714956 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-n2r78_1bdd73ee-82ea-4bf9-9629-952f519692d3/manager/2.log"
Apr 22 19:20:42.757411 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:42.757374 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-dd89cc56c-5spxh_3c9f2586-5824-4e5d-ad14-46003a8242ac/manager/0.log"
Apr 22 19:20:44.234450 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:44.234415 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-wv888_20c3eb36-cee9-45e7-802a-8362efab2dbd/openshift-lws-operator/0.log"
Apr 22 19:20:50.396977 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.396949 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c4hqs_ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:20:50.416737 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.416708 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c4hqs_ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68/egress-router-binary-copy/0.log"
Apr 22 19:20:50.436616 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.436585 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c4hqs_ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68/cni-plugins/0.log"
Apr 22 19:20:50.456795 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.456763 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c4hqs_ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68/bond-cni-plugin/0.log"
Apr 22 19:20:50.477089 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.477065 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c4hqs_ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68/routeoverride-cni/0.log"
Apr 22 19:20:50.498129 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.498105 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c4hqs_ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68/whereabouts-cni-bincopy/0.log"
Apr 22 19:20:50.518012 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.517986 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c4hqs_ae80bd70-7d80-4b8b-a4e9-5728e1e5fe68/whereabouts-cni/0.log"
Apr 22 19:20:50.611582 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.611542 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x5q6c_c557042d-06c7-4315-8e35-0884cd906ef9/kube-multus/0.log"
Apr 22 19:20:50.675619 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.675600 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l6jrz_eaf5856d-69b7-4e87-b07c-f8ea8eed1048/network-metrics-daemon/0.log"
Apr 22 19:20:50.714456 ip-10-0-129-85 kubenswrapper[2578]: I0422 19:20:50.714425 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l6jrz_eaf5856d-69b7-4e87-b07c-f8ea8eed1048/kube-rbac-proxy/0.log"