Apr 22 18:43:24.695463 ip-10-0-135-106 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:43:25.077079 ip-10-0-135-106 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:25.077079 ip-10-0-135-106 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:43:25.077079 ip-10-0-135-106 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:25.077079 ip-10-0-135-106 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:43:25.077079 ip-10-0-135-106 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
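The deprecation warnings above all point at the file passed via the kubelet's --config flag. As a hedged sketch only (field names are from the KubeletConfiguration v1beta1 API; the endpoint value mirrors the --container-runtime-endpoint seen later in this log, while the plugin dir, reserved sizes, and eviction threshold are placeholders, not values from this node):

```yaml
# Sketch of moving the deprecated flags into the kubelet config file
# (e.g. the /etc/kubernetes/kubelet.conf referenced by --config below).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir; path is a placeholder
systemReserved:          # replaces --system-reserved; sizes are placeholders
  cpu: 500m
  memory: 1Gi
evictionHard:            # per the hint for --minimum-container-ttl-duration; threshold is a placeholder
  memory.available: 100Mi
```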
Apr 22 18:43:25.080039 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.079948 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:43:25.084882 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084867 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:25.084882 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084882 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084886 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084890 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084893 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084896 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084906 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084909 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084913 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084915 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084918 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084921 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084924 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084926 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084929 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084932 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084934 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084937 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084939 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084942 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084944 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084947 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:25.084946 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084950 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084953 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084955 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084958 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084963 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084972 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084976 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084979 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084982 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084985 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084988 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084990 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084993 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084996 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.084999 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085001 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085006 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085008 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085011 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:25.085490 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085014 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085016 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085019 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085022 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085025 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085027 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085029 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085032 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085034 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085037 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085040 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085042 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085044 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085047 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085050 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085053 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085056 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085059 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085061 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:25.085948 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085066 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085069 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085072 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085075 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085077 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085080 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085083 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085086 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085089 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085093 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085095 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085098 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085101 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085103 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085106 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085108 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085111 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085113 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085116 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:25.086429 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085119 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085122 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085124 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085127 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085129 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085132 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085134 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085574 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085580 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085584 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085588 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085590 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085593 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085601 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085604 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085607 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085610 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085612 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085615 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085618 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:25.086987 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085622 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085626 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085630 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085633 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085636 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085639 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085642 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085644 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085647 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085649 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085652 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085654 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085657 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085659 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085662 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085664 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085666 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085669 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085672 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085674 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:25.087485 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085678 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085681 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085683 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085685 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085688 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085690 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085693 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085696 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085698 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085701 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085703 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085706 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085708 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085711 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085713 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085715 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085718 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085721 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085723 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085725 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:25.087991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085728 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085730 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085732 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085735 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085738 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085740 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085744 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085747 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085749 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085753 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085755 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085758 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085760 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085763 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085766 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085768 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085771 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085775 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085777 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085780 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:25.088507 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085783 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085785 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085787 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085790 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085792 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085795 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085797 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085800 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085802 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085805 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085807 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085810 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.085812 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086800 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086810 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086818 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086823 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086828 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086831 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086836 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:43:25.088991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086841 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086845 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086848 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086852 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086855 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086859 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086862 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086866 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086869 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086871 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086874 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086877 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086890 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086894 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086897 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086900 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086904 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086908 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086911 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086914 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086917 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086920 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086923 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086926 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086929 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:43:25.089487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086932 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086937 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086940 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086943 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086946 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086949 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086952 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086962 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086965 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086968 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086971 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086974 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086978 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086981 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086984 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086987 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086990 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086994 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.086997 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087000 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087008 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087012 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087014 2578 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087018 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087021 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:43:25.090077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087025 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087028 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087031 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087034 2578 flags.go:64] FLAG: --help="false"
Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087037 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087040 2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087043 2578 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 18:43:25.090694 ip-10-0-135-106
kubenswrapper[2578]: I0422 18:43:25.087046 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087049 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087053 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087056 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087059 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087062 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087065 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087068 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087071 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087074 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087077 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087080 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087083 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087085 2578 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087088 2578 flags.go:64] FLAG: --lock-file="" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087091 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087094 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:43:25.090694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087097 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087103 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087106 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087108 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087116 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087120 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087123 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087126 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087129 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087133 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087136 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:43:25.091286 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:43:25.087140 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087143 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087146 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087149 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087152 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087155 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087158 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087161 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087169 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087192 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087197 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087201 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:43:25.091286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087206 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087219 2578 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087222 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087225 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087228 2578 flags.go:64] FLAG: --port="10250" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087232 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087235 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00948474cb287b708" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087238 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087241 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087244 2578 flags.go:64] FLAG: --register-node="true" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087247 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087250 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087254 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087257 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087261 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087264 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087267 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 
18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087270 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087273 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087276 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087279 2578 flags.go:64] FLAG: --runonce="false" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087282 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087285 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087288 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087291 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087294 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:43:25.091879 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087297 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087300 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087303 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087306 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087309 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: 
I0422 18:43:25.087312 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087315 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087318 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087321 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087324 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087330 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087333 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087337 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087343 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087346 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087349 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087352 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087355 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087359 2578 flags.go:64] FLAG: --v="2" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087363 2578 flags.go:64] FLAG: --version="false" Apr 22 18:43:25.092523 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087367 2578 flags.go:64] FLAG: --vmodule="" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087372 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.087375 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087483 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087487 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:43:25.092523 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087490 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087494 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087497 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087499 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087502 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087505 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087507 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087510 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 
22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087512 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087514 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087517 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087519 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087522 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087526 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087528 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087531 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087533 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087536 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087538 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087541 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:43:25.093117 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087543 2578 
feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087545 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087548 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087550 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087553 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087555 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087558 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087560 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087563 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087566 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087569 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087571 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087578 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:43:25.093779 ip-10-0-135-106 
kubenswrapper[2578]: W0422 18:43:25.087581 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087583 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087586 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087589 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087591 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087594 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087597 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:43:25.093779 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087599 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087602 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087605 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087607 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087610 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087613 2578 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087615 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087618 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087620 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087623 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087625 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087628 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087630 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087633 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087636 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087638 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087641 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087643 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 
18:43:25.087646 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087649 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:43:25.094663 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087651 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087655 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087660 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087664 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087670 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087673 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087676 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087678 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087681 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087684 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087687 2578 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087690 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087692 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087695 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087698 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087700 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087703 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087705 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087708 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:43:25.095543 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087710 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:43:25.096073 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087713 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:43:25.096073 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087718 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:43:25.096073 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087721 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:43:25.096073 
ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.087724 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:43:25.096073 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.088228 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:43:25.096298 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.096275 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:43:25.096350 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.096302 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:43:25.096398 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096380 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:43:25.096398 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096388 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:43:25.096398 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096393 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096399 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096404 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096410 2578 feature_gate.go:328] unrecognized feature gate: 
MachineConfigNodes Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096414 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096421 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096426 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096430 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096435 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096439 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096443 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096447 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096450 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096454 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096458 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096462 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096467 2578 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096471 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096476 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096480 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:43:25.096534 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096484 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096488 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096492 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096496 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096500 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096504 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096508 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096513 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096517 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:43:25.097415 ip-10-0-135-106 
kubenswrapper[2578]: W0422 18:43:25.096521 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096525 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096529 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096533 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096537 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096542 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096545 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096552 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096558 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096564 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:43:25.097415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096568 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096573 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096577 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096581 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096585 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096589 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096595 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096601 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096606 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096610 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096614 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096619 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096623 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096627 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096631 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096635 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096641 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096645 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096650 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:43:25.097947 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096655 2578 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096659 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096663 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096667 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096671 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096675 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096679 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096683 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096688 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096692 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096696 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096700 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096704 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:43:25.098506 ip-10-0-135-106 
kubenswrapper[2578]: W0422 18:43:25.096710 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096714 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096718 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096722 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096727 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096731 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096735 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:43:25.098506 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096739 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096743 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096747 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096751 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096756 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096760 2578 feature_gate.go:328] unrecognized 
feature gate: BootImageSkewEnforcement Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.096767 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096936 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096943 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096949 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096954 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096958 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096962 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096967 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096971 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096976 2578 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:43:25.099160 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096981 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096985 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096989 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096994 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.096999 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097003 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097008 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097012 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097016 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097021 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097026 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097030 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:43:25.099855 ip-10-0-135-106 
kubenswrapper[2578]: W0422 18:43:25.097035 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097039 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097044 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097048 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097052 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097056 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097060 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097064 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:43:25.099855 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097068 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097072 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097076 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097080 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097085 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 
18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097089 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097093 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097098 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097102 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097106 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097110 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097142 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097147 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097152 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097157 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097162 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097169 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097195 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097201 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:43:25.100699 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097205 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097209 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097214 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097219 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097224 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097229 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097233 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097237 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097241 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097245 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:43:25.101242 ip-10-0-135-106 
kubenswrapper[2578]: W0422 18:43:25.097249 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097253 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097257 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097262 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097266 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097270 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097274 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097281 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097286 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097291 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:43:25.101242 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097295 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097300 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097304 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097308 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097313 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097317 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097321 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097325 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097329 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097333 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097337 2578 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097342 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097346 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097350 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097354 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097358 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097362 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:43:25.101732 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:25.097367 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:43:25.102126 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.097375 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:43:25.102126 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.098200 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:43:25.102126 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.100843 2578 
bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:43:25.102126 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.101697 2578 server.go:1019] "Starting client certificate rotation" Apr 22 18:43:25.102126 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.101795 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:43:25.102415 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.102403 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:43:25.128123 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.128095 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:43:25.132224 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.132202 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:43:25.147912 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.147893 2578 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:43:25.152859 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.152845 2578 log.go:25] "Validated CRI v1 image API" Apr 22 18:43:25.154763 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.154749 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:43:25.157783 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.157764 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b252a3de-9c70-42e9-b026-b66bf17dc4fe:/dev/nvme0n1p3 d155f693-aeb4-4cea-b421-848a17e0bd84:/dev/nvme0n1p4] Apr 22 18:43:25.157839 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.157782 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 
minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:43:25.163488 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.163375 2578 manager.go:217] Machine: {Timestamp:2026-04-22 18:43:25.161800043 +0000 UTC m=+0.363520908 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100388 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27d87b7028eee7007a3e1d57b625b5 SystemUUID:ec27d87b-7028-eee7-007a-3e1d57b625b5 BootID:de726dcc-4c31-4045-90a5-172c0066296f Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c7:3d:ed:66:23 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c7:3d:ed:66:23 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:45:90:ec:42:a3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 
4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:43:25.163973 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.163961 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 18:43:25.164086 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.164074 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:43:25.165786 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.165764 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:43:25.165914 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.165789 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-106.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:43:25.165961 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.165924 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:43:25.165961 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.165933 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:43:25.165961 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.165946 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:43:25.167211 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.167198 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:43:25.168562 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.168550 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:43:25.168683 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.168674 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 18:43:25.170378 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.170357 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:43:25.170647 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.170633 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:43:25.170647 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.170649 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:43:25.170757 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.170664 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:43:25.170757 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.170673 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:43:25.170757 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.170685 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 18:43:25.171829 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.171816 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:43:25.171874 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.171838 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:43:25.174372 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.174357 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 18:43:25.175538 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.175526 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 18:43:25.176957 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.176946 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 18:43:25.176994 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.176964 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 18:43:25.176994 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.176971 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 18:43:25.176994 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.176976 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 18:43:25.176994 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.176981 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 18:43:25.176994 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.176987 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 18:43:25.176994 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.176993 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 18:43:25.177148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.177001 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 18:43:25.177148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.177010 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 18:43:25.177148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.177017 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 18:43:25.177148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.177025 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 18:43:25.177148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.177035 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 18:43:25.177148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.177066 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 18:43:25.177148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.177072 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 18:43:25.180719 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.180707 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 18:43:25.180774 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.180742 2578 server.go:1295] "Started kubelet"
Apr 22 18:43:25.180838 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.180817 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 18:43:25.180910 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.180846 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 18:43:25.180964 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.180940 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 18:43:25.181633 ip-10-0-135-106 systemd[1]: Started Kubernetes Kubelet.
Apr 22 18:43:25.182347 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.182164 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 18:43:25.183145 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.183131 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 18:43:25.186086 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.186064 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 18:43:25.186214 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.186094 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-106.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 18:43:25.186260 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.186240 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 18:43:25.188349 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.188332 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 18:43:25.188802 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.188784 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 18:43:25.189200 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189161 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 18:43:25.189774 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189755 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 18:43:25.189774 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189768 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 18:43:25.189922 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189782 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 18:43:25.189922 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189864 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 18:43:25.189922 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189870 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 18:43:25.190051 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189943 2578 factory.go:55] Registering systemd factory
Apr 22 18:43:25.190051 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.189958 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 22 18:43:25.190051 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.190012 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.190193 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.190161 2578 factory.go:153] Registering CRI-O factory
Apr 22 18:43:25.190240 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.190195 2578 factory.go:223] Registration of the crio container factory successfully
Apr 22 18:43:25.190282 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.190271 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 18:43:25.190321 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.190295 2578 factory.go:103] Registering Raw factory
Apr 22 18:43:25.190321 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.190305 2578 manager.go:1196] Started watching for new ooms in manager
Apr 22 18:43:25.190863 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.190848 2578 manager.go:319] Starting recovery of all containers
Apr 22 18:43:25.204107 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.204085 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 18:43:25.204376 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.204364 2578 manager.go:324] Recovery completed
Apr 22 18:43:25.204960 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.204075 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a8c20aed154f78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-22 18:43:25.180718968 +0000 UTC m=+0.382439827,LastTimestamp:2026-04-22 18:43:25.180718968 +0000 UTC m=+0.382439827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 22 18:43:25.205870 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.205851 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 18:43:25.208434 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.208419 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:43:25.210810 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.210793 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:43:25.210886 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.210830 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:43:25.210886 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.210845 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:43:25.211351 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.211335 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 18:43:25.211351 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.211350 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 18:43:25.211452 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.211364 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:43:25.214443 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.214431 2578 policy_none.go:49] "None policy: Start"
Apr 22 18:43:25.214488 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.214448 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 18:43:25.214488 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.214457 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 18:43:25.215376 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.214935 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a8c20aeee07b90 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-106.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-22 18:43:25.21081128 +0000 UTC m=+0.412532143,LastTimestamp:2026-04-22 18:43:25.21081128 +0000 UTC m=+0.412532143,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 22 18:43:25.237790 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.237708 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a8c20aeee0df85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-106.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-22 18:43:25.210836869 +0000 UTC m=+0.412557731,LastTimestamp:2026-04-22 18:43:25.210836869 +0000 UTC m=+0.412557731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 22 18:43:25.253739 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.253723 2578 manager.go:341] "Starting Device Plugin manager"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.253798 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.253813 2578 server.go:85] "Starting device plugin registration server"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.254035 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.254048 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.254143 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.254230 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.254240 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.254730 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.254761 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.274187 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.263883 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a8c20aeee116fb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-135-106.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-22 18:43:25.210851067 +0000 UTC m=+0.412571929,LastTimestamp:2026-04-22 18:43:25.210851067 +0000 UTC m=+0.412571929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 22 18:43:25.278599 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.278521 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a8c20af193e80a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-22 18:43:25.256124426 +0000 UTC m=+0.457845276,LastTimestamp:2026-04-22 18:43:25.256124426 +0000 UTC m=+0.457845276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 22 18:43:25.303107 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.303084 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zknt8"
Apr 22 18:43:25.311989 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.311968 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zknt8"
Apr 22 18:43:25.322677 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.322654 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 18:43:25.324039 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.324022 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 18:43:25.324111 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.324053 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 18:43:25.324111 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.324074 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 18:43:25.324111 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.324080 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 18:43:25.324265 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.324115 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 18:43:25.335932 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.335873 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:43:25.354802 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.354773 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:43:25.355623 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.355603 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:43:25.355706 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.355631 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:43:25.355706 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.355643 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:43:25.355706 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.355666 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.365739 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.365724 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.365782 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.365746 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-106.ec2.internal\": node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.413446 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.413421 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.424783 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.424764 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"]
Apr 22 18:43:25.424835 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.424828 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:43:25.425564 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.425550 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:43:25.425633 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.425577 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:43:25.425633 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.425588 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:43:25.427948 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.427936 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:43:25.428065 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428052 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.428099 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428079 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:43:25.428613 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428589 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:43:25.428613 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428613 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:43:25.428739 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428625 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:43:25.428739 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428595 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:43:25.428739 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428679 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:43:25.428739 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.428688 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:43:25.431487 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.431467 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.431557 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.431502 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:43:25.432165 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.432151 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:43:25.432242 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.432191 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:43:25.432242 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.432203 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:43:25.451705 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.451687 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-106.ec2.internal\" not found" node="ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.456137 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.456124 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-106.ec2.internal\" not found" node="ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.491801 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.491784 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.491849 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.491814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.491849 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.491832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/477d8d5a68adc15182b0ab0c3cde7f73-config\") pod \"kube-apiserver-proxy-ip-10-0-135-106.ec2.internal\" (UID: \"477d8d5a68adc15182b0ab0c3cde7f73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.514257 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.514240 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.592592 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.592541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.592592 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.592576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.592684 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.592597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/477d8d5a68adc15182b0ab0c3cde7f73-config\") pod \"kube-apiserver-proxy-ip-10-0-135-106.ec2.internal\" (UID: \"477d8d5a68adc15182b0ab0c3cde7f73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.592684 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.592640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.592747 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.592681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/477d8d5a68adc15182b0ab0c3cde7f73-config\") pod \"kube-apiserver-proxy-ip-10-0-135-106.ec2.internal\" (UID: \"477d8d5a68adc15182b0ab0c3cde7f73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.592747 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.592707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.614666 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.614647 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.715413 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.715392 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.753574 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.753557 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.759082 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:25.759065 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 22 18:43:25.816144 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.816120 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:25.916687 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:25.916609 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:26.017093 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:26.017070 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:26.101715 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.101690 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:43:26.102327 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.101836 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:43:26.117821 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:26.117800 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:26.189161 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.189105 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:43:26.205831 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.205811 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:43:26.218417 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:26.218398 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 22 18:43:26.225425 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.225408 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-z2k5j"
Apr 22 18:43:26.234226 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.234207 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-z2k5j"
Apr 22 18:43:26.291339 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:26.291308 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477d8d5a68adc15182b0ab0c3cde7f73.slice/crio-3903b31a2a717a1b48e431df89497b05af95f1663072c664a3f637f56c46334b WatchSource:0}: Error finding container 3903b31a2a717a1b48e431df89497b05af95f1663072c664a3f637f56c46334b: Status 404 returned error can't find the container with id 3903b31a2a717a1b48e431df89497b05af95f1663072c664a3f637f56c46334b
Apr 22 18:43:26.291518 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:26.291498 2578 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d71cd7329eab1e79f952bebf6a7f77b.slice/crio-ce6df45c4eec79d1555f062f0f8978baf6b6714757d11febddcabfa923d3aa5e WatchSource:0}: Error finding container ce6df45c4eec79d1555f062f0f8978baf6b6714757d11febddcabfa923d3aa5e: Status 404 returned error can't find the container with id ce6df45c4eec79d1555f062f0f8978baf6b6714757d11febddcabfa923d3aa5e Apr 22 18:43:26.294924 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.294911 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:43:26.314000 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.313961 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:38:25 +0000 UTC" deadline="2028-01-06 02:07:03.180190476 +0000 UTC" Apr 22 18:43:26.314000 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.313998 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14959h23m36.866195275s" Apr 22 18:43:26.316134 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.316112 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:26.318745 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:26.318730 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found" Apr 22 18:43:26.326817 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.326773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" event={"ID":"8d71cd7329eab1e79f952bebf6a7f77b","Type":"ContainerStarted","Data":"ce6df45c4eec79d1555f062f0f8978baf6b6714757d11febddcabfa923d3aa5e"} Apr 22 18:43:26.327673 ip-10-0-135-106 kubenswrapper[2578]: I0422 
18:43:26.327657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" event={"ID":"477d8d5a68adc15182b0ab0c3cde7f73","Type":"ContainerStarted","Data":"3903b31a2a717a1b48e431df89497b05af95f1663072c664a3f637f56c46334b"} Apr 22 18:43:26.328049 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.328035 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:26.419564 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:26.419516 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found" Apr 22 18:43:26.520144 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:26.520066 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found" Apr 22 18:43:26.620572 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:26.620538 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found" Apr 22 18:43:26.660336 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.660307 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:26.690679 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.690418 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" Apr 22 18:43:26.704121 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.704094 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:43:26.704292 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.704265 2578 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" Apr 22 18:43:26.711589 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:26.711489 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:43:27.172270 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.172002 2578 apiserver.go:52] "Watching apiserver" Apr 22 18:43:27.181445 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.181414 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:43:27.182406 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.182375 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-rt5wm","openshift-dns/node-resolver-b9bb8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal","openshift-multus/multus-sdhdb","openshift-multus/network-metrics-daemon-gzjvx","openshift-network-diagnostics/network-check-target-pjr62","openshift-network-operator/iptables-alerter-bzz4l","openshift-ovn-kubernetes/ovnkube-node-mm8rp","kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh","openshift-cluster-node-tuning-operator/tuned-rn9b2","openshift-image-registry/node-ca-6268s","openshift-multus/multus-additional-cni-plugins-jvwbm"] Apr 22 18:43:27.186779 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.186757 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.186888 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.186848 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.189075 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.189020 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.189651 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.189632 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bdh6\"" Apr 22 18:43:27.189783 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.189763 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:43:27.189892 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.189823 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:43:27.189892 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.189864 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:43:27.190042 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.190026 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:43:27.190103 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.190029 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qkfkn\"" Apr 22 18:43:27.190156 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.190139 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:43:27.191436 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.191418 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:27.191529 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.191488 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:27.193509 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.192250 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:43:27.193509 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.192402 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:43:27.193509 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.192618 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:43:27.193509 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.193117 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:43:27.194920 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.194222 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-82r49\"" Apr 22 18:43:27.198125 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.198101 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:27.198236 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.198187 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:27.198236 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.198217 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:27.200550 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.200530 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.200927 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.200898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqwhw\" (UniqueName: \"kubernetes.io/projected/111ee8c4-f2a7-4e7b-8faf-15392cc75774-kube-api-access-xqwhw\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:27.201011 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.200941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rkl\" (UniqueName: \"kubernetes.io/projected/4de5e80e-562a-46e6-8d36-b01153c2710d-kube-api-access-x4rkl\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.201011 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.200964 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f559fab9-b5a3-456c-8531-308e3635428e-cni-binary-copy\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201011 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.200986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-socket-dir-parent\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201164 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-k8s-cni-cncf-io\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201164 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-cni-bin\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201164 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201077 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f559fab9-b5a3-456c-8531-308e3635428e-multus-daemon-config\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201164 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201109 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-hosts-file\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.201164 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201135 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-system-cni-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201164 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201159 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-kubelet\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-conf-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-etc-kubernetes\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " 
pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4de5e80e-562a-46e6-8d36-b01153c2710d-host-slash\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201279 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-tmp-dir\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201294 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jqjm5\"" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w9f7\" (UniqueName: 
\"kubernetes.io/projected/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-kube-api-access-9w9f7\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-cni-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201351 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-cnibin\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-cni-multus\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4de5e80e-562a-46e6-8d36-b01153c2710d-iptables-alerter-script\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201446 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-os-release\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201472 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-netns\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201992 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-hostroot\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201992 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201473 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:43:27.201992 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-multus-certs\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.201992 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.201562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx6rp\" (UniqueName: 
\"kubernetes.io/projected/f559fab9-b5a3-456c-8531-308e3635428e-kube-api-access-xx6rp\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.202855 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.202813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.202855 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.202837 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:43:27.203053 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.203020 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:43:27.203280 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.203267 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:43:27.203322 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.203287 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:43:27.203456 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.203440 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:43:27.203656 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.203640 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:43:27.203830 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.203815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-7mrr7\"" Apr 22 18:43:27.205139 ip-10-0-135-106 kubenswrapper[2578]: 
I0422 18:43:27.205120 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.205728 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.205711 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:43:27.205972 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.205952 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sdbzg\"" Apr 22 18:43:27.206693 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.206676 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:43:27.207271 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.207252 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:43:27.207506 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.207489 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wl2jm\"" Apr 22 18:43:27.207635 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.207619 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:43:27.207725 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.207716 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6268s" Apr 22 18:43:27.207786 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.207751 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:43:27.210249 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.210231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.210979 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.210564 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4rbfm\"" Apr 22 18:43:27.210979 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.210571 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:43:27.210979 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.210842 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:43:27.211151 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.211012 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:43:27.212389 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.212369 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t2cwl\"" Apr 22 18:43:27.212680 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.212662 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:43:27.212893 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.212875 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:43:27.234866 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.234840 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:38:26 +0000 UTC" deadline="2028-01-29 09:15:01.04470108 +0000 UTC" Apr 22 18:43:27.234952 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.234868 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15518h31m33.809838258s" Apr 22 18:43:27.291507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.291478 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:43:27.302741 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f559fab9-b5a3-456c-8531-308e3635428e-multus-daemon-config\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.302851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-run\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.302851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdntm\" (UniqueName: \"kubernetes.io/projected/09c19634-8110-452e-9a84-963e44013755-kube-api-access-hdntm\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " 
pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.302851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-hosts-file\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.302851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-system-cni-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.302851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-conf-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.302851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-etc-selinux\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302883 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-modprobe-d\") pod \"tuned-rn9b2\" (UID: 
\"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302918 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-conf-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-cnibin\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-var-lib-kubelet\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302982 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-system-cni-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-cnibin\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " 
pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303002 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a42ca4fd-ca90-4584-971c-d1d61ff097f6-serviceca\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.302930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-hosts-file\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-run-netns\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-systemd\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-node-log\") pod \"ovnkube-node-mm8rp\" (UID: 
\"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-sys-fs\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303144 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-lib-modules\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.303189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-cni-bin\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-cni-bin\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovn-node-metrics-cert\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-kubelet\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f99f\" (UniqueName: \"kubernetes.io/projected/9a5fc927-51a2-476b-8637-3a28218e303a-kube-api-access-5f99f\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:27.303800 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:43:27.303351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9w9f7\" (UniqueName: \"kubernetes.io/projected/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-kube-api-access-9w9f7\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303376 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-cni-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-kubelet\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-cni-netd\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysconfig\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 
18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.303461 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-cni-dir\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303472 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-sys\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-multus-certs\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-var-lib-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.303800 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303528 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f559fab9-b5a3-456c-8531-308e3635428e-multus-daemon-config\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.303547 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:27.803517343 +0000 UTC m=+3.005238208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303579 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-multus-certs\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-etc-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqwhw\" (UniqueName: 
\"kubernetes.io/projected/111ee8c4-f2a7-4e7b-8faf-15392cc75774-kube-api-access-xqwhw\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rkl\" (UniqueName: \"kubernetes.io/projected/4de5e80e-562a-46e6-8d36-b01153c2710d-kube-api-access-x4rkl\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-k8s-cni-cncf-io\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysctl-conf\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-tmp\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303874 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-k8s-cni-cncf-io\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303881 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a42ca4fd-ca90-4584-971c-d1d61ff097f6-host\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303910 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-os-release\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-etc-kubernetes\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-kubelet\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303974 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovnkube-config\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.304483 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.303999 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-etc-kubernetes\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-system-cni-dir\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-cni-binary-copy\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " 
pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4de5e80e-562a-46e6-8d36-b01153c2710d-host-slash\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4de5e80e-562a-46e6-8d36-b01153c2710d-host-slash\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-tmp-dir\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-slash\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304362 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-ovn\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-log-socket\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-env-overrides\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304452 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-kubernetes\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysctl-d\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304515 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-os-release\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304517 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-tmp-dir\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-netns\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.305153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304591 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-os-release\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx6rp\" (UniqueName: \"kubernetes.io/projected/f559fab9-b5a3-456c-8531-308e3635428e-kube-api-access-xx6rp\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-run-netns\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/51bbe31f-c966-4131-8425-d7f7a16f402e-konnectivity-ca\") pod \"konnectivity-agent-rt5wm\" (UID: \"51bbe31f-c966-4131-8425-d7f7a16f402e\") " pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-socket-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304711 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-systemd\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-host\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-tuned\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-socket-dir-parent\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqn4t\" (UniqueName: \"kubernetes.io/projected/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-kube-api-access-jqn4t\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51bbe31f-c966-4131-8425-d7f7a16f402e-agent-certs\") pod \"konnectivity-agent-rt5wm\" (UID: \"51bbe31f-c966-4131-8425-d7f7a16f402e\") " pod="kube-system/konnectivity-agent-rt5wm"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304861 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-multus-socket-dir-parent\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-device-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304909 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-systemd-units\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.305785 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.304985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-cni-bin\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-registration-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6wg6\" (UniqueName: \"kubernetes.io/projected/a42ca4fd-ca90-4584-971c-d1d61ff097f6-kube-api-access-n6wg6\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-cni-multus\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305132 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrk9z\" (UniqueName: \"kubernetes.io/projected/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-kube-api-access-lrk9z\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305136 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-host-var-lib-cni-multus\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-cnibin\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4de5e80e-562a-46e6-8d36-b01153c2710d-iptables-alerter-script\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-hostroot\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305282 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovnkube-script-lib\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f559fab9-b5a3-456c-8531-308e3635428e-hostroot\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f559fab9-b5a3-456c-8531-308e3635428e-cni-binary-copy\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305686 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4de5e80e-562a-46e6-8d36-b01153c2710d-iptables-alerter-script\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l"
Apr 22 18:43:27.306306 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.305795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f559fab9-b5a3-456c-8531-308e3635428e-cni-binary-copy\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.310123 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.310099 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:43:27.311233 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.311212 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:43:27.313508 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.313483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqwhw\" (UniqueName: \"kubernetes.io/projected/111ee8c4-f2a7-4e7b-8faf-15392cc75774-kube-api-access-xqwhw\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:27.313584 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.313508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w9f7\" (UniqueName: \"kubernetes.io/projected/09d3f614-cd83-4e5b-8bb6-06b778d0eda3-kube-api-access-9w9f7\") pod \"node-resolver-b9bb8\" (UID: \"09d3f614-cd83-4e5b-8bb6-06b778d0eda3\") " pod="openshift-dns/node-resolver-b9bb8"
Apr 22 18:43:27.313584 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.313487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rkl\" (UniqueName: \"kubernetes.io/projected/4de5e80e-562a-46e6-8d36-b01153c2710d-kube-api-access-x4rkl\") pod \"iptables-alerter-bzz4l\" (UID: \"4de5e80e-562a-46e6-8d36-b01153c2710d\") " pod="openshift-network-operator/iptables-alerter-bzz4l"
Apr 22 18:43:27.313822 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.313800 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx6rp\" (UniqueName: \"kubernetes.io/projected/f559fab9-b5a3-456c-8531-308e3635428e-kube-api-access-xx6rp\") pod \"multus-sdhdb\" (UID: \"f559fab9-b5a3-456c-8531-308e3635428e\") " pod="openshift-multus/multus-sdhdb"
Apr 22 18:43:27.405639 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-cni-bin\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.405639 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-registration-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.405824 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6wg6\" (UniqueName: \"kubernetes.io/projected/a42ca4fd-ca90-4584-971c-d1d61ff097f6-kube-api-access-n6wg6\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s"
Apr 22 18:43:27.405824 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-cni-bin\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.405824 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.405824 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrk9z\" (UniqueName: \"kubernetes.io/projected/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-kube-api-access-lrk9z\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.405824 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-registration-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.405824 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-cnibin\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.405824 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405817 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.406063 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-cnibin\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.406063 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovnkube-script-lib\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.406063 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.406063 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-run\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.406063 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.405985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdntm\" (UniqueName: \"kubernetes.io/projected/09c19634-8110-452e-9a84-963e44013755-kube-api-access-hdntm\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.406063 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-etc-selinux\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.406063 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-modprobe-d\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-run\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-var-lib-kubelet\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a42ca4fd-ca90-4584-971c-d1d61ff097f6-serviceca\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-etc-selinux\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-run-netns\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-systemd\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-modprobe-d\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-systemd\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-node-log\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406282 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-run-netns\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-node-log\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-sys-fs\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-lib-modules\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.406393 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-var-lib-kubelet\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovn-node-metrics-cert\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f99f\" (UniqueName: \"kubernetes.io/projected/9a5fc927-51a2-476b-8637-3a28218e303a-kube-api-access-5f99f\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-lib-modules\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-sys-fs\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-cni-netd\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-cni-netd\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406543 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysconfig\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-sys\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-var-lib-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406602 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-etc-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysctl-conf\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovnkube-script-lib\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-var-lib-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a42ca4fd-ca90-4584-971c-d1d61ff097f6-serviceca\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-tmp\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a42ca4fd-ca90-4584-971c-d1d61ff097f6-host\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-etc-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysconfig\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-os-release\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a42ca4fd-ca90-4584-971c-d1d61ff097f6-host\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-sys\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-os-release\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysctl-conf\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-kubelet\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovnkube-config\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-kubelet\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-openvswitch\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.406978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-system-cni-dir\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-cni-binary-copy\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.407743 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-system-cni-dir\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm"
Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-slash\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-ovn\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName:
\"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-log-socket\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407212 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-slash\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09c19634-8110-452e-9a84-963e44013755-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-env-overrides\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-kubernetes\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407267 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-log-socket\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407267 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-run-ovn\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysctl-d\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407329 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-kubernetes\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/51bbe31f-c966-4131-8425-d7f7a16f402e-konnectivity-ca\") pod \"konnectivity-agent-rt5wm\" (UID: \"51bbe31f-c966-4131-8425-d7f7a16f402e\") " pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-socket-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovnkube-config\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.408591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407474 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-systemd\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407524 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-cni-binary-copy\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-host\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407594 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-socket-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-tuned\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqn4t\" (UniqueName: \"kubernetes.io/projected/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-kube-api-access-jqn4t\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-env-overrides\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-systemd\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407673 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/51bbe31f-c966-4131-8425-d7f7a16f402e-agent-certs\") pod \"konnectivity-agent-rt5wm\" (UID: \"51bbe31f-c966-4131-8425-d7f7a16f402e\") " pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-sysctl-d\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-host\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-device-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407860 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-systemd-units\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/51bbe31f-c966-4131-8425-d7f7a16f402e-konnectivity-ca\") pod \"konnectivity-agent-rt5wm\" (UID: \"51bbe31f-c966-4131-8425-d7f7a16f402e\") " pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.409582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.407977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9a5fc927-51a2-476b-8637-3a28218e303a-device-dir\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.410453 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.408096 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-systemd-units\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.410453 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.408379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/09c19634-8110-452e-9a84-963e44013755-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.410453 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.409230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-tmp\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.410453 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.410218 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-etc-tuned\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.410770 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.410743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/51bbe31f-c966-4131-8425-d7f7a16f402e-agent-certs\") pod \"konnectivity-agent-rt5wm\" (UID: \"51bbe31f-c966-4131-8425-d7f7a16f402e\") " pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:27.410770 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.410777 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-ovn-node-metrics-cert\") pod \"ovnkube-node-mm8rp\" (UID: \"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.413827 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.413807 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:27.413927 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.413831 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:27.413927 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.413860 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ps84d for pod openshift-network-diagnostics/network-check-target-pjr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:27.414051 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.413943 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d podName:37e1b769-57d0-4c74-9a5d-c4eca3f94231 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:27.913923915 +0000 UTC m=+3.115644781 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ps84d" (UniqueName: "kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d") pod "network-check-target-pjr62" (UID: "37e1b769-57d0-4c74-9a5d-c4eca3f94231") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:27.416451 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.416425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrk9z\" (UniqueName: \"kubernetes.io/projected/4be7b7fe-d0ef-4d72-8b6e-bc683687994f-kube-api-access-lrk9z\") pod \"tuned-rn9b2\" (UID: \"4be7b7fe-d0ef-4d72-8b6e-bc683687994f\") " pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.416825 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.416787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f99f\" (UniqueName: \"kubernetes.io/projected/9a5fc927-51a2-476b-8637-3a28218e303a-kube-api-access-5f99f\") pod \"aws-ebs-csi-driver-node-fnpxh\" (UID: \"9a5fc927-51a2-476b-8637-3a28218e303a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.416924 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.416877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6wg6\" (UniqueName: \"kubernetes.io/projected/a42ca4fd-ca90-4584-971c-d1d61ff097f6-kube-api-access-n6wg6\") pod \"node-ca-6268s\" (UID: \"a42ca4fd-ca90-4584-971c-d1d61ff097f6\") " pod="openshift-image-registry/node-ca-6268s" Apr 22 18:43:27.417848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.417820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqn4t\" (UniqueName: \"kubernetes.io/projected/7cca2fcb-981e-45db-b2b9-8fc7b0d093b4-kube-api-access-jqn4t\") pod \"ovnkube-node-mm8rp\" (UID: 
\"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.418256 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.418229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdntm\" (UniqueName: \"kubernetes.io/projected/09c19634-8110-452e-9a84-963e44013755-kube-api-access-hdntm\") pod \"multus-additional-cni-plugins-jvwbm\" (UID: \"09c19634-8110-452e-9a84-963e44013755\") " pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.499947 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.499849 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bzz4l" Apr 22 18:43:27.506666 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.506640 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b9bb8" Apr 22 18:43:27.515372 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.515353 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sdhdb" Apr 22 18:43:27.519334 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.519303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:27.525996 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.525969 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" Apr 22 18:43:27.532694 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.532674 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" Apr 22 18:43:27.539246 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.539224 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" Apr 22 18:43:27.545794 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.545773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6268s" Apr 22 18:43:27.550400 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.550377 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" Apr 22 18:43:27.810641 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:27.810567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:27.810794 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.810726 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:27.810842 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:27.810794 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:28.810777786 +0000 UTC m=+4.012498632 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:27.932050 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.931827 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de5e80e_562a_46e6_8d36_b01153c2710d.slice/crio-f788f06f3648c7743212aa32449275f362a0f09026259b7c142c5fe7c1beb8c7 WatchSource:0}: Error finding container f788f06f3648c7743212aa32449275f362a0f09026259b7c142c5fe7c1beb8c7: Status 404 returned error can't find the container with id f788f06f3648c7743212aa32449275f362a0f09026259b7c142c5fe7c1beb8c7 Apr 22 18:43:27.936065 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.936044 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a5fc927_51a2_476b_8637_3a28218e303a.slice/crio-a89534b63b757a6705915f830b0b8d05e2c9935615307f3b88c8d1a1ceaa25d9 WatchSource:0}: Error finding container a89534b63b757a6705915f830b0b8d05e2c9935615307f3b88c8d1a1ceaa25d9: Status 404 returned error can't find the container with id a89534b63b757a6705915f830b0b8d05e2c9935615307f3b88c8d1a1ceaa25d9 Apr 22 18:43:27.938794 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.938726 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cca2fcb_981e_45db_b2b9_8fc7b0d093b4.slice/crio-4417e8ef8c56192bf9db38e17aab1a72831830206e8cf9c8e14183f46aa1c183 WatchSource:0}: Error finding container 4417e8ef8c56192bf9db38e17aab1a72831830206e8cf9c8e14183f46aa1c183: Status 404 returned error can't find the container with id 4417e8ef8c56192bf9db38e17aab1a72831830206e8cf9c8e14183f46aa1c183 Apr 22 18:43:27.939515 
ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.939460 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be7b7fe_d0ef_4d72_8b6e_bc683687994f.slice/crio-8424a9d22daee308411bcb7f6b3144e1c68dc9a54e0e4886e6d91cd47454fd58 WatchSource:0}: Error finding container 8424a9d22daee308411bcb7f6b3144e1c68dc9a54e0e4886e6d91cd47454fd58: Status 404 returned error can't find the container with id 8424a9d22daee308411bcb7f6b3144e1c68dc9a54e0e4886e6d91cd47454fd58 Apr 22 18:43:27.940404 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.940335 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d3f614_cd83_4e5b_8bb6_06b778d0eda3.slice/crio-b292f65c1bf8343dcd66713c79ef3b08cb0174a8c9a81447a54d6c85f4cb55e5 WatchSource:0}: Error finding container b292f65c1bf8343dcd66713c79ef3b08cb0174a8c9a81447a54d6c85f4cb55e5: Status 404 returned error can't find the container with id b292f65c1bf8343dcd66713c79ef3b08cb0174a8c9a81447a54d6c85f4cb55e5 Apr 22 18:43:27.941809 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.941792 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42ca4fd_ca90_4584_971c_d1d61ff097f6.slice/crio-7e5902178fc282cdbc71409866cb6ff608b0c6a007b1d438294cc7af16b8c0af WatchSource:0}: Error finding container 7e5902178fc282cdbc71409866cb6ff608b0c6a007b1d438294cc7af16b8c0af: Status 404 returned error can't find the container with id 7e5902178fc282cdbc71409866cb6ff608b0c6a007b1d438294cc7af16b8c0af Apr 22 18:43:27.963527 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.963506 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51bbe31f_c966_4131_8425_d7f7a16f402e.slice/crio-d501100f7a4d7f7318d36f9b1afe75d14356783d34a3cfb852aae82413f6239a WatchSource:0}: 
Error finding container d501100f7a4d7f7318d36f9b1afe75d14356783d34a3cfb852aae82413f6239a: Status 404 returned error can't find the container with id d501100f7a4d7f7318d36f9b1afe75d14356783d34a3cfb852aae82413f6239a Apr 22 18:43:27.964572 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.964549 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c19634_8110_452e_9a84_963e44013755.slice/crio-2e033fba03037ae835826c96bced0e893f35808e3757b82a3608bb411766f4d7 WatchSource:0}: Error finding container 2e033fba03037ae835826c96bced0e893f35808e3757b82a3608bb411766f4d7: Status 404 returned error can't find the container with id 2e033fba03037ae835826c96bced0e893f35808e3757b82a3608bb411766f4d7 Apr 22 18:43:27.965796 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:27.965766 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf559fab9_b5a3_456c_8531_308e3635428e.slice/crio-928261709eb875d5d56795d9aa7a49fad75dfb819baee1295c9c4e74094063ee WatchSource:0}: Error finding container 928261709eb875d5d56795d9aa7a49fad75dfb819baee1295c9c4e74094063ee: Status 404 returned error can't find the container with id 928261709eb875d5d56795d9aa7a49fad75dfb819baee1295c9c4e74094063ee Apr 22 18:43:28.012216 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.012169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:28.012349 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.012299 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:28.012349 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.012315 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:28.012349 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.012324 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ps84d for pod openshift-network-diagnostics/network-check-target-pjr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:28.012675 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.012373 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d podName:37e1b769-57d0-4c74-9a5d-c4eca3f94231 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:29.012356935 +0000 UTC m=+4.214077785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ps84d" (UniqueName: "kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d") pod "network-check-target-pjr62" (UID: "37e1b769-57d0-4c74-9a5d-c4eca3f94231") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:28.235096 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.234968 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:38:26 +0000 UTC" deadline="2027-11-12 19:59:31.985164974 +0000 UTC" Apr 22 18:43:28.235096 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.235018 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13657h16m3.750151503s" Apr 22 18:43:28.325438 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.324906 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:28.325438 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.325042 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:28.325438 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.325132 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:28.325438 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.325286 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:28.332424 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.332373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdhdb" event={"ID":"f559fab9-b5a3-456c-8531-308e3635428e","Type":"ContainerStarted","Data":"928261709eb875d5d56795d9aa7a49fad75dfb819baee1295c9c4e74094063ee"} Apr 22 18:43:28.333859 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.333837 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerStarted","Data":"2e033fba03037ae835826c96bced0e893f35808e3757b82a3608bb411766f4d7"} Apr 22 18:43:28.335266 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.335241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rt5wm" event={"ID":"51bbe31f-c966-4131-8425-d7f7a16f402e","Type":"ContainerStarted","Data":"d501100f7a4d7f7318d36f9b1afe75d14356783d34a3cfb852aae82413f6239a"} Apr 22 18:43:28.336777 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.336732 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b9bb8" event={"ID":"09d3f614-cd83-4e5b-8bb6-06b778d0eda3","Type":"ContainerStarted","Data":"b292f65c1bf8343dcd66713c79ef3b08cb0174a8c9a81447a54d6c85f4cb55e5"} Apr 22 18:43:28.339006 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.338940 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"4417e8ef8c56192bf9db38e17aab1a72831830206e8cf9c8e14183f46aa1c183"} Apr 22 18:43:28.342923 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.342861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" event={"ID":"9a5fc927-51a2-476b-8637-3a28218e303a","Type":"ContainerStarted","Data":"a89534b63b757a6705915f830b0b8d05e2c9935615307f3b88c8d1a1ceaa25d9"} Apr 22 18:43:28.350305 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.350278 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" event={"ID":"477d8d5a68adc15182b0ab0c3cde7f73","Type":"ContainerStarted","Data":"5da8dfe343d0f96aa6dd8365020999afae72565453ee02be6c0f0bf4f9ff6147"} Apr 22 18:43:28.355355 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.355324 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" event={"ID":"4be7b7fe-d0ef-4d72-8b6e-bc683687994f","Type":"ContainerStarted","Data":"8424a9d22daee308411bcb7f6b3144e1c68dc9a54e0e4886e6d91cd47454fd58"} Apr 22 18:43:28.356233 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.356212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6268s" event={"ID":"a42ca4fd-ca90-4584-971c-d1d61ff097f6","Type":"ContainerStarted","Data":"7e5902178fc282cdbc71409866cb6ff608b0c6a007b1d438294cc7af16b8c0af"} Apr 22 18:43:28.359029 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.358979 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bzz4l" event={"ID":"4de5e80e-562a-46e6-8d36-b01153c2710d","Type":"ContainerStarted","Data":"f788f06f3648c7743212aa32449275f362a0f09026259b7c142c5fe7c1beb8c7"} Apr 22 18:43:28.818053 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:28.818019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:28.818259 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.818160 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:28.818259 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:28.818242 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:30.818223395 +0000 UTC m=+6.019944245 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:29.020671 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.020586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:29.020856 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:29.020771 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:29.020856 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:29.020796 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:29.020856 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:29.020810 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ps84d for pod openshift-network-diagnostics/network-check-target-pjr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:29.021025 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:29.020871 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d podName:37e1b769-57d0-4c74-9a5d-c4eca3f94231 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:31.020851765 +0000 UTC m=+6.222572626 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ps84d" (UniqueName: "kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d") pod "network-check-target-pjr62" (UID: "37e1b769-57d0-4c74-9a5d-c4eca3f94231") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:29.371796 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.371699 2578 generic.go:358] "Generic (PLEG): container finished" podID="8d71cd7329eab1e79f952bebf6a7f77b" containerID="def3912b0c386edd6f42f74da95e79bb54c6568b3e1f66508010eae1d9ac1b2f" exitCode=0 Apr 22 18:43:29.372703 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.372673 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" event={"ID":"8d71cd7329eab1e79f952bebf6a7f77b","Type":"ContainerDied","Data":"def3912b0c386edd6f42f74da95e79bb54c6568b3e1f66508010eae1d9ac1b2f"} Apr 22 18:43:29.399260 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.398115 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" podStartSLOduration=3.398097236 podStartE2EDuration="3.398097236s" podCreationTimestamp="2026-04-22 18:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:43:28.370016584 +0000 UTC m=+3.571737454" watchObservedRunningTime="2026-04-22 18:43:29.398097236 +0000 UTC m=+4.599818094" Apr 22 18:43:29.771973 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.771152 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q9qsg"] Apr 22 18:43:29.774505 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:43:29.774481 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.774627 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:29.774561 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:29.827337 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.827292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/827aef15-283e-49d7-8df1-ebaee65d73aa-dbus\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.827519 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.827366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/827aef15-283e-49d7-8df1-ebaee65d73aa-kubelet-config\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.827519 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.827435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.928978 ip-10-0-135-106 kubenswrapper[2578]: 
I0422 18:43:29.928246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/827aef15-283e-49d7-8df1-ebaee65d73aa-kubelet-config\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.928978 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.928338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.928978 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.928380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/827aef15-283e-49d7-8df1-ebaee65d73aa-dbus\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.928978 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.928548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/827aef15-283e-49d7-8df1-ebaee65d73aa-dbus\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.928978 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:29.928608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/827aef15-283e-49d7-8df1-ebaee65d73aa-kubelet-config\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:29.928978 
ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:29.928700 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:29.928978 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:29.928754 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret podName:827aef15-283e-49d7-8df1-ebaee65d73aa nodeName:}" failed. No retries permitted until 2026-04-22 18:43:30.428736864 +0000 UTC m=+5.630457711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret") pod "global-pull-secret-syncer-q9qsg" (UID: "827aef15-283e-49d7-8df1-ebaee65d73aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:30.324624 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:30.324553 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:30.324814 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:30.324674 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:30.325091 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:30.325071 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:30.325234 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:30.325198 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:30.385317 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:30.385283 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" event={"ID":"8d71cd7329eab1e79f952bebf6a7f77b","Type":"ContainerStarted","Data":"1264a424c12b2421b816ab41d5154d7fc093d59fec474a381ca0d037d304e8d5"} Apr 22 18:43:30.432655 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:30.432620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:30.432832 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:30.432817 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:30.432890 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:30.432876 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret podName:827aef15-283e-49d7-8df1-ebaee65d73aa nodeName:}" failed. No retries permitted until 2026-04-22 18:43:31.432859723 +0000 UTC m=+6.634580575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret") pod "global-pull-secret-syncer-q9qsg" (UID: "827aef15-283e-49d7-8df1-ebaee65d73aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:30.835581 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:30.835545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:30.835767 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:30.835696 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:30.835767 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:30.835757 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:34.83574027 +0000 UTC m=+10.037461124 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:31.037580 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:31.037541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:31.037773 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:31.037711 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:31.037773 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:31.037729 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:31.037773 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:31.037739 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ps84d for pod openshift-network-diagnostics/network-check-target-pjr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:31.037930 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:31.037788 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d podName:37e1b769-57d0-4c74-9a5d-c4eca3f94231 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:35.037774733 +0000 UTC m=+10.239495580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ps84d" (UniqueName: "kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d") pod "network-check-target-pjr62" (UID: "37e1b769-57d0-4c74-9a5d-c4eca3f94231") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:31.324500 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:31.324384 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:31.324648 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:31.324520 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:31.441084 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:31.441040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:31.441542 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:31.441210 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:31.441542 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:31.441272 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret podName:827aef15-283e-49d7-8df1-ebaee65d73aa nodeName:}" failed. No retries permitted until 2026-04-22 18:43:33.441252777 +0000 UTC m=+8.642973653 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret") pod "global-pull-secret-syncer-q9qsg" (UID: "827aef15-283e-49d7-8df1-ebaee65d73aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:32.324353 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:32.324324 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:32.324522 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:32.324450 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:32.324826 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:32.324808 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:32.324929 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:32.324911 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:33.324850 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:33.324811 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:33.325318 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:33.324946 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:33.459144 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:33.459110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:33.459305 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:33.459288 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:33.459357 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:33.459351 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret podName:827aef15-283e-49d7-8df1-ebaee65d73aa nodeName:}" failed. No retries permitted until 2026-04-22 18:43:37.459332406 +0000 UTC m=+12.661053308 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret") pod "global-pull-secret-syncer-q9qsg" (UID: "827aef15-283e-49d7-8df1-ebaee65d73aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:34.325067 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:34.325034 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:34.325467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:34.325033 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:34.325467 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:34.325203 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:34.325467 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:34.325241 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:34.869753 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:34.869712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:34.869925 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:34.869842 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:34.869925 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:34.869922 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:42.869901349 +0000 UTC m=+18.071622201 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:35.071278 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:35.071236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:35.071440 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:35.071421 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:35.071518 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:35.071443 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:35.071518 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:35.071457 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ps84d for pod openshift-network-diagnostics/network-check-target-pjr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:35.071518 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:35.071514 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d 
podName:37e1b769-57d0-4c74-9a5d-c4eca3f94231 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:43.071497435 +0000 UTC m=+18.273218298 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ps84d" (UniqueName: "kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d") pod "network-check-target-pjr62" (UID: "37e1b769-57d0-4c74-9a5d-c4eca3f94231") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:35.325044 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:35.324961 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:35.325212 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:35.325090 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:36.324632 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:36.324592 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:36.324804 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:36.324645 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:36.324804 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:36.324767 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:36.324908 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:36.324881 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:37.324791 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:37.324759 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:37.325170 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:37.324885 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:37.492160 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:37.492072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:37.492329 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:37.492215 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:37.492329 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:37.492281 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret podName:827aef15-283e-49d7-8df1-ebaee65d73aa nodeName:}" failed. No retries permitted until 2026-04-22 18:43:45.492265852 +0000 UTC m=+20.693986722 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret") pod "global-pull-secret-syncer-q9qsg" (UID: "827aef15-283e-49d7-8df1-ebaee65d73aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:38.324593 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:38.324558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:38.324742 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:38.324558 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:38.324742 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:38.324675 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:38.324852 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:38.324740 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:39.325332 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:39.325282 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:39.325778 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:39.325416 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:40.324763 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:40.324728 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:40.324944 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:40.324730 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:40.324944 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:40.324849 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:40.325047 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:40.324945 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:41.325084 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:41.325051 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:41.325509 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:41.325208 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:42.324650 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:42.324617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:42.324846 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:42.324617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:42.324846 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:42.324759 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:42.324846 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:42.324805 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:42.932286 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:42.932253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:42.932673 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:42.932407 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:42.932673 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:42.932476 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:58.932460861 +0000 UTC m=+34.134181711 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:43.133962 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:43.133927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:43.134234 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:43.134095 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:43.134234 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:43.134119 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:43.134234 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:43.134134 2578 projected.go:194] Error preparing data for projected volume kube-api-access-ps84d for pod openshift-network-diagnostics/network-check-target-pjr62: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:43.134234 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:43.134209 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d podName:37e1b769-57d0-4c74-9a5d-c4eca3f94231 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:59.13419025 +0000 UTC m=+34.335911113 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ps84d" (UniqueName: "kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d") pod "network-check-target-pjr62" (UID: "37e1b769-57d0-4c74-9a5d-c4eca3f94231") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:43.325389 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:43.325310 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:43.325536 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:43.325433 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:44.324891 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:44.324859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:44.324891 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:44.324875 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:44.325403 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:44.324957 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:44.325403 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:44.325049 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:45.325472 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.325316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:45.325961 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:45.325540 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa" Apr 22 18:43:45.413653 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.413606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sdhdb" event={"ID":"f559fab9-b5a3-456c-8531-308e3635428e","Type":"ContainerStarted","Data":"f6ed82392fb4f99c27f8e87596db1e01dadcbd6d8c893eaab4d26bf41f986c0d"} Apr 22 18:43:45.414915 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.414892 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerStarted","Data":"fb965003a41c3e9867e5c3b903c5d30ca6b63d0be788f9d9fa3ffafde1ed6538"} Apr 22 18:43:45.416152 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.416116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rt5wm" event={"ID":"51bbe31f-c966-4131-8425-d7f7a16f402e","Type":"ContainerStarted","Data":"5d86c40e8af7b2f9b48077d0e94cb1cb32cb4080f914ce28b2805b7f9c172c3c"} Apr 22 18:43:45.417398 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.417374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b9bb8" event={"ID":"09d3f614-cd83-4e5b-8bb6-06b778d0eda3","Type":"ContainerStarted","Data":"6d55eb8c7eb9f303a711ca11f9b88f705b4201132dc9b48cc2b2e6f2c3d03db8"} Apr 22 18:43:45.419639 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.419612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"a7fcd9fb1b89463815f3225b51b8ca51847162a9e520872655ca57710fddd5a8"} Apr 22 18:43:45.419714 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.419640 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" 
event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"84c64d65b5e3b769862bbf64f5d62ba0a83cf6e566f39bd865aee7243bca525f"} Apr 22 18:43:45.420907 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.420886 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" event={"ID":"9a5fc927-51a2-476b-8637-3a28218e303a","Type":"ContainerStarted","Data":"c25d3ffe876287793dbd2a6dda593d42f0b02a8b2a848625c4f99e88c0265b4c"} Apr 22 18:43:45.422132 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.422103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" event={"ID":"4be7b7fe-d0ef-4d72-8b6e-bc683687994f","Type":"ContainerStarted","Data":"9f0111dadb29d6fc95be969bbddc92ad0cfd1a2c6c08dede5a36eed48fd87235"} Apr 22 18:43:45.423718 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.423304 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6268s" event={"ID":"a42ca4fd-ca90-4584-971c-d1d61ff097f6","Type":"ContainerStarted","Data":"08b62a3a19ee4b1640da28a1689f9194e76688934083bf281e01449ecf33eada"} Apr 22 18:43:45.433878 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.433835 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" podStartSLOduration=19.433816979 podStartE2EDuration="19.433816979s" podCreationTimestamp="2026-04-22 18:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:43:30.413242947 +0000 UTC m=+5.614963816" watchObservedRunningTime="2026-04-22 18:43:45.433816979 +0000 UTC m=+20.635537848" Apr 22 18:43:45.448644 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.448603 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-sdhdb" podStartSLOduration=3.419132764 podStartE2EDuration="20.448586694s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.968237557 +0000 UTC m=+3.169958406" lastFinishedPulling="2026-04-22 18:43:44.997691475 +0000 UTC m=+20.199412336" observedRunningTime="2026-04-22 18:43:45.433144215 +0000 UTC m=+20.634865137" watchObservedRunningTime="2026-04-22 18:43:45.448586694 +0000 UTC m=+20.650307564" Apr 22 18:43:45.449497 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.449290 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b9bb8" podStartSLOduration=3.453507009 podStartE2EDuration="20.449280502s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.962568345 +0000 UTC m=+3.164289204" lastFinishedPulling="2026-04-22 18:43:44.958341837 +0000 UTC m=+20.160062697" observedRunningTime="2026-04-22 18:43:45.44871778 +0000 UTC m=+20.650438631" watchObservedRunningTime="2026-04-22 18:43:45.449280502 +0000 UTC m=+20.651001373" Apr 22 18:43:45.467476 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.467413 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rn9b2" podStartSLOduration=3.41095773 podStartE2EDuration="20.467400422s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.941247182 +0000 UTC m=+3.142968033" lastFinishedPulling="2026-04-22 18:43:44.997689863 +0000 UTC m=+20.199410725" observedRunningTime="2026-04-22 18:43:45.466969218 +0000 UTC m=+20.668690086" watchObservedRunningTime="2026-04-22 18:43:45.467400422 +0000 UTC m=+20.669121291" Apr 22 18:43:45.481780 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.481738 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rt5wm" podStartSLOduration=3.488899333 
podStartE2EDuration="20.481723563s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.96546931 +0000 UTC m=+3.167190174" lastFinishedPulling="2026-04-22 18:43:44.958293547 +0000 UTC m=+20.160014404" observedRunningTime="2026-04-22 18:43:45.481592922 +0000 UTC m=+20.683313790" watchObservedRunningTime="2026-04-22 18:43:45.481723563 +0000 UTC m=+20.683444432" Apr 22 18:43:45.496926 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.496885 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6268s" podStartSLOduration=7.937280723 podStartE2EDuration="20.496871869s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.962442639 +0000 UTC m=+3.164163487" lastFinishedPulling="2026-04-22 18:43:40.522033771 +0000 UTC m=+15.723754633" observedRunningTime="2026-04-22 18:43:45.496846186 +0000 UTC m=+20.698567062" watchObservedRunningTime="2026-04-22 18:43:45.496871869 +0000 UTC m=+20.698592739" Apr 22 18:43:45.551632 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:45.551606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg" Apr 22 18:43:45.552296 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:45.552273 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:45.552398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:45.552336 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret podName:827aef15-283e-49d7-8df1-ebaee65d73aa nodeName:}" failed. 
No retries permitted until 2026-04-22 18:44:01.552316725 +0000 UTC m=+36.754037586 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret") pod "global-pull-secret-syncer-q9qsg" (UID: "827aef15-283e-49d7-8df1-ebaee65d73aa") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:46.324582 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.324548 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:43:46.324745 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.324548 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:43:46.324745 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:46.324675 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774" Apr 22 18:43:46.324745 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:46.324716 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231" Apr 22 18:43:46.425930 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.425838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bzz4l" event={"ID":"4de5e80e-562a-46e6-8d36-b01153c2710d","Type":"ContainerStarted","Data":"3ea095a7c2953bd60f5646fcc5ec199f6c09e8ef1f945abdb1e87fbca350e7af"} Apr 22 18:43:46.426942 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.426920 2578 generic.go:358] "Generic (PLEG): container finished" podID="09c19634-8110-452e-9a84-963e44013755" containerID="fb965003a41c3e9867e5c3b903c5d30ca6b63d0be788f9d9fa3ffafde1ed6538" exitCode=0 Apr 22 18:43:46.427050 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.426984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerDied","Data":"fb965003a41c3e9867e5c3b903c5d30ca6b63d0be788f9d9fa3ffafde1ed6538"} Apr 22 18:43:46.429266 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.429249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log" Apr 22 18:43:46.429541 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.429520 2578 generic.go:358] "Generic (PLEG): container finished" podID="7cca2fcb-981e-45db-b2b9-8fc7b0d093b4" containerID="a7fcd9fb1b89463815f3225b51b8ca51847162a9e520872655ca57710fddd5a8" exitCode=1 Apr 22 18:43:46.429654 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.429613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerDied","Data":"a7fcd9fb1b89463815f3225b51b8ca51847162a9e520872655ca57710fddd5a8"} Apr 22 18:43:46.429734 ip-10-0-135-106 kubenswrapper[2578]: 
I0422 18:43:46.429664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"2d74d7f2bc9d1bd83691b894b2c79572aaaae5d48c343190411b3492670ee129"} Apr 22 18:43:46.429734 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.429679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"7e2a6221633986f2f7073c1d3a16c8edcab5513ee6e81b0904d788ea491986db"} Apr 22 18:43:46.429734 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.429691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"1d93f1e53374b42e35377f1bd85ed0d94b1a9bd61d6d1aa027a6a11299e208bb"} Apr 22 18:43:46.429734 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.429704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"495a22d9f4cc32b565d727c035f929df5bc7e8137bad3c20dab2b86561132200"} Apr 22 18:43:46.442481 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:46.442447 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bzz4l" podStartSLOduration=4.420310347 podStartE2EDuration="21.442435409s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.936100856 +0000 UTC m=+3.137821709" lastFinishedPulling="2026-04-22 18:43:44.958225922 +0000 UTC m=+20.159946771" observedRunningTime="2026-04-22 18:43:46.442069665 +0000 UTC m=+21.643790537" watchObservedRunningTime="2026-04-22 18:43:46.442435409 +0000 UTC m=+21.644156318" Apr 22 18:43:46.642074 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:43:46.642039 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:43:47.245635 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.245600 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:47.246538 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.246362 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rt5wm" Apr 22 18:43:47.264781 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.264631 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:43:46.642061281Z","UUID":"ef883c98-effb-4f5b-b19a-e7ed58a68372","Handler":null,"Name":"","Endpoint":""} Apr 22 18:43:47.266781 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.266762 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:43:47.266781 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.266785 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:43:47.325093 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.325052 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:43:47.325265 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:47.325171 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa"
Apr 22 18:43:47.434298 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.434255 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" event={"ID":"9a5fc927-51a2-476b-8637-3a28218e303a","Type":"ContainerStarted","Data":"b6a549c6492803de27abfe143c46f71af31f8629c346014556dd435d7eea78f0"}
Apr 22 18:43:47.434298 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.434305 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rt5wm"
Apr 22 18:43:47.434828 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:47.434626 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rt5wm"
Apr 22 18:43:48.324587 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:48.324550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:48.324757 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:48.324675 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231"
Apr 22 18:43:48.324757 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:48.324746 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:48.324902 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:48.324879 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774"
Apr 22 18:43:48.438574 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:48.438540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log"
Apr 22 18:43:48.439092 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:48.438918 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"8a237fe3e0f00ff7b941ddec7456efe1ec87b73084a1215c65a2d2c05590b09c"}
Apr 22 18:43:48.440962 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:48.440933 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" event={"ID":"9a5fc927-51a2-476b-8637-3a28218e303a","Type":"ContainerStarted","Data":"36f0850335faeebb0300a1b5ccc33f87a5f759cae3fcbcef8bd3f2756c5a16f8"}
Apr 22 18:43:48.464558 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:48.464516 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fnpxh" podStartSLOduration=3.536758881 podStartE2EDuration="23.464499323s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.938686672 +0000 UTC m=+3.140407518" lastFinishedPulling="2026-04-22 18:43:47.866427098 +0000 UTC m=+23.068147960" observedRunningTime="2026-04-22 18:43:48.464330361 +0000 UTC m=+23.666051242" watchObservedRunningTime="2026-04-22 18:43:48.464499323 +0000 UTC m=+23.666220194"
Apr 22 18:43:49.324475 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:49.324441 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:43:49.324630 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:49.324568 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa"
Apr 22 18:43:50.324541 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.324304 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:50.324541 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.324373 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:50.325113 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:50.324559 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231"
Apr 22 18:43:50.325113 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:50.324629 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774"
Apr 22 18:43:50.446584 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.446558 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log"
Apr 22 18:43:50.446853 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.446829 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"cf50a7a2e5e152c5072871ad46294746f17401b0381608100e4057260239cdd2"}
Apr 22 18:43:50.447194 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.447159 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:50.447257 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.447209 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:50.447354 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.447338 2578 scope.go:117] "RemoveContainer" containerID="a7fcd9fb1b89463815f3225b51b8ca51847162a9e520872655ca57710fddd5a8"
Apr 22 18:43:50.466381 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:50.466359 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:51.324379 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:51.324334 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:43:51.324527 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:51.324463 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa"
Apr 22 18:43:51.451875 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:51.451850 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log"
Apr 22 18:43:51.452327 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:51.452215 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" event={"ID":"7cca2fcb-981e-45db-b2b9-8fc7b0d093b4","Type":"ContainerStarted","Data":"506b869bc0fc2407295b854d8f5ef384edef9f7907019935c50af1f5188b2edf"}
Apr 22 18:43:51.452601 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:51.452573 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:51.469670 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:51.469647 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:43:51.488365 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:51.488326 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp" podStartSLOduration=9.364511731 podStartE2EDuration="26.488314499s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.940748929 +0000 UTC m=+3.142469776" lastFinishedPulling="2026-04-22 18:43:45.064551692 +0000 UTC m=+20.266272544" observedRunningTime="2026-04-22 18:43:51.487759774 +0000 UTC m=+26.689480645" watchObservedRunningTime="2026-04-22 18:43:51.488314499 +0000 UTC m=+26.690035369"
Apr 22 18:43:52.325259 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.325226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:52.325445 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.325227 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:52.325445 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:52.325328 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231"
Apr 22 18:43:52.325529 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:52.325435 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774"
Apr 22 18:43:52.356722 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.356683 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q9qsg"]
Apr 22 18:43:52.356883 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.356809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:43:52.356943 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:52.356908 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa"
Apr 22 18:43:52.359296 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.359269 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gzjvx"]
Apr 22 18:43:52.372238 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.372214 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pjr62"]
Apr 22 18:43:52.454319 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.454292 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:52.454686 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:52.454295 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:52.454686 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:52.454384 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231"
Apr 22 18:43:52.454801 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:52.454779 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774"
Apr 22 18:43:54.324690 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:54.324652 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:43:54.325150 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:54.324778 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa"
Apr 22 18:43:54.325276 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:54.325254 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:54.325399 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:54.325368 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774"
Apr 22 18:43:54.325521 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:54.325431 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:54.325521 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:54.325489 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231"
Apr 22 18:43:55.462049 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:55.461875 2578 generic.go:358] "Generic (PLEG): container finished" podID="09c19634-8110-452e-9a84-963e44013755" containerID="6cea59cf1fdecdcbc2caefe829c39035ef9be5c569556d70e4a98e8aad35131f" exitCode=0
Apr 22 18:43:55.462049 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:55.461968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerDied","Data":"6cea59cf1fdecdcbc2caefe829c39035ef9be5c569556d70e4a98e8aad35131f"}
Apr 22 18:43:56.324990 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:56.324955 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:43:56.325147 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:56.325063 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-q9qsg" podUID="827aef15-283e-49d7-8df1-ebaee65d73aa"
Apr 22 18:43:56.325147 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:56.325066 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:56.325147 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:56.325091 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:56.325147 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:56.325133 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjr62" podUID="37e1b769-57d0-4c74-9a5d-c4eca3f94231"
Apr 22 18:43:56.325301 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:56.325234 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774"
Apr 22 18:43:57.467619 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:57.467583 2578 generic.go:358] "Generic (PLEG): container finished" podID="09c19634-8110-452e-9a84-963e44013755" containerID="00e3e8dededad1745d180800ec4aa9ad9209a0daafa1953cbe80e0a02c57da00" exitCode=0
Apr 22 18:43:57.468252 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:57.467650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerDied","Data":"00e3e8dededad1745d180800ec4aa9ad9209a0daafa1953cbe80e0a02c57da00"}
Apr 22 18:43:58.157370 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.157307 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeReady"
Apr 22 18:43:58.157563 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.157426 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:43:58.200074 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.200043 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc"]
Apr 22 18:43:58.206698 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.206676 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f7876885c-jjjr6"]
Apr 22 18:43:58.206812 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.206798 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc"
Apr 22 18:43:58.209317 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.209261 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 18:43:58.209443 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.209388 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-xhf5f\""
Apr 22 18:43:58.209443 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.209414 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 18:43:58.209606 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.209466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 18:43:58.209918 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.209900 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 18:43:58.214227 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.214210 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"]
Apr 22 18:43:58.214374 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.214357 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.217993 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.217964 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nnjgq\""
Apr 22 18:43:58.218125 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.217966 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:43:58.218125 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.218006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:43:58.218394 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.218378 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:43:58.222205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.222168 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7"]
Apr 22 18:43:58.222312 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.222288 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"
Apr 22 18:43:58.225222 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.225203 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 18:43:58.228632 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.228611 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:43:58.232583 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.232466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc"]
Apr 22 18:43:58.232583 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.232483 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"]
Apr 22 18:43:58.232583 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.232498 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f7876885c-jjjr6"]
Apr 22 18:43:58.232583 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.232511 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7"]
Apr 22 18:43:58.232583 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.232523 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vkwg8"]
Apr 22 18:43:58.232826 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.232595 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7"
Apr 22 18:43:58.235532 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.235511 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 18:43:58.235644 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.235618 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 18:43:58.236189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.236162 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 18:43:58.236851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.236830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 18:43:58.244894 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.244874 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-btplr"]
Apr 22 18:43:58.245077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.245063 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:43:58.249795 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.249779 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xcpp5\""
Apr 22 18:43:58.249875 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.249830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:43:58.249875 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.249853 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:43:58.254674 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.254648 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vkwg8"]
Apr 22 18:43:58.254674 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.254667 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-btplr"]
Apr 22 18:43:58.254815 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.254755 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:43:58.257120 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.257102 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:43:58.257558 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.257539 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:43:58.258146 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.258061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xmncv\""
Apr 22 18:43:58.258246 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.258223 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:43:58.325158 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.325132 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:58.325312 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.325132 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:58.325373 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.325132 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:43:58.327616 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.327601 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgrtw\""
Apr 22 18:43:58.327716 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.327601 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:43:58.327791 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.327777 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:43:58.327791 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.327783 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:43:58.327891 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.327817 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6clv\""
Apr 22 18:43:58.327891 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.327842 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 18:43:58.347307 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347288 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f083405e-bfba-4773-8229-17a3c99d04cb-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7766cfd6d7-24vwc\" (UID: \"f083405e-bfba-4773-8229-17a3c99d04cb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc"
Apr 22 18:43:58.347402 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-hub\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7"
Apr 22 18:43:58.347402 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7"
Apr 22 18:43:58.347402 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:43:58.347402 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqgbm\" (UniqueName: \"kubernetes.io/projected/c89a564f-52a0-4145-8cad-2ae9ecec0329-kube-api-access-wqgbm\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:43:58.347525 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-image-registry-private-configuration\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.347525 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.347525 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-bound-sa-token\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.347614 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347523 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vldsm\" (UniqueName: \"kubernetes.io/projected/cef89a60-b37e-40b5-8a2e-0a715b7661ce-kube-api-access-vldsm\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"
Apr 22 18:43:58.347614 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwzgh\" (UniqueName: \"kubernetes.io/projected/f083405e-bfba-4773-8229-17a3c99d04cb-kube-api-access-mwzgh\") pod \"managed-serviceaccount-addon-agent-7766cfd6d7-24vwc\" (UID: \"f083405e-bfba-4773-8229-17a3c99d04cb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc"
Apr 22 18:43:58.347614 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-certificates\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.347704 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a0225c7-98ca-4a53-925d-9a970a77218d-ca-trust-extracted\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.347704 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cef89a60-b37e-40b5-8a2e-0a715b7661ce-tmp\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"
Apr 22 18:43:58.347704 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c89a564f-52a0-4145-8cad-2ae9ecec0329-tmp-dir\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:43:58.347704 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-installation-pull-secrets\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.347704 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9t69\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-kube-api-access-g9t69\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7"
Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c89a564f-52a0-4145-8cad-2ae9ecec0329-config-volume\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lglz7\"
(UniqueName: \"kubernetes.io/projected/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-kube-api-access-lglz7\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-ca\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cef89a60-b37e-40b5-8a2e-0a715b7661ce-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvzxh\" (UniqueName: \"kubernetes.io/projected/f27a9a75-c190-4880-b981-29378f194918-kube-api-access-rvzxh\") pod \"ingress-canary-btplr\" (UID: 
\"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:43:58.347848 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-trusted-ca\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.348078 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.347852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.449085 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-image-registry-private-configuration\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.449274 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.449274 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449123 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-bound-sa-token\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.449274 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vldsm\" (UniqueName: \"kubernetes.io/projected/cef89a60-b37e-40b5-8a2e-0a715b7661ce-kube-api-access-vldsm\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.449274 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449216 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwzgh\" (UniqueName: \"kubernetes.io/projected/f083405e-bfba-4773-8229-17a3c99d04cb-kube-api-access-mwzgh\") pod \"managed-serviceaccount-addon-agent-7766cfd6d7-24vwc\" (UID: \"f083405e-bfba-4773-8229-17a3c99d04cb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" Apr 22 18:43:58.449274 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-certificates\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/7a0225c7-98ca-4a53-925d-9a970a77218d-ca-trust-extracted\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cef89a60-b37e-40b5-8a2e-0a715b7661ce-tmp\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c89a564f-52a0-4145-8cad-2ae9ecec0329-tmp-dir\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-installation-pull-secrets\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9t69\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-kube-api-access-g9t69\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.449507 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:43:58.449402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449423 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c89a564f-52a0-4145-8cad-2ae9ecec0329-config-volume\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lglz7\" (UniqueName: \"kubernetes.io/projected/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-kube-api-access-lglz7\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.449507 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-ca\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.449907 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.449511 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:43:58.449907 ip-10-0-135-106 kubenswrapper[2578]: E0422 
18:43:58.449526 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found Apr 22 18:43:58.449907 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.449610 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed. No retries permitted until 2026-04-22 18:43:58.949581208 +0000 UTC m=+34.151302056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found Apr 22 18:43:58.449907 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cef89a60-b37e-40b5-8a2e-0a715b7661ce-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.449907 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.449890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c89a564f-52a0-4145-8cad-2ae9ecec0329-tmp-dir\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:43:58.450156 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a0225c7-98ca-4a53-925d-9a970a77218d-ca-trust-extracted\") pod 
\"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.450238 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cef89a60-b37e-40b5-8a2e-0a715b7661ce-tmp\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvzxh\" (UniqueName: \"kubernetes.io/projected/f27a9a75-c190-4880-b981-29378f194918-kube-api-access-rvzxh\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-trusted-ca\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f083405e-bfba-4773-8229-17a3c99d04cb-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7766cfd6d7-24vwc\" (UID: \"f083405e-bfba-4773-8229-17a3c99d04cb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-hub\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: 
\"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqgbm\" (UniqueName: \"kubernetes.io/projected/c89a564f-52a0-4145-8cad-2ae9ecec0329-kube-api-access-wqgbm\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.450753 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c89a564f-52a0-4145-8cad-2ae9ecec0329-config-volume\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.450854 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.450911 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:58.950895134 +0000 UTC m=+34.152615998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.450927 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:58.451205 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.450989 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:58.950970729 +0000 UTC m=+34.152691585 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found Apr 22 18:43:58.452021 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.451998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-trusted-ca\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.452640 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.452619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.455316 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.455287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-image-registry-private-configuration\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.455420 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.455397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.455528 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.455501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-hub\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.456696 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.456644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/cef89a60-b37e-40b5-8a2e-0a715b7661ce-klusterlet-config\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: \"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.456877 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.456841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-ca\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.457552 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.457277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f083405e-bfba-4773-8229-17a3c99d04cb-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7766cfd6d7-24vwc\" (UID: \"f083405e-bfba-4773-8229-17a3c99d04cb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" Apr 22 18:43:58.457968 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.457936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.458665 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.458584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-certificates\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.460103 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.460079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vldsm\" (UniqueName: \"kubernetes.io/projected/cef89a60-b37e-40b5-8a2e-0a715b7661ce-kube-api-access-vldsm\") pod \"klusterlet-addon-workmgr-7d588db7bf-8v7mz\" (UID: 
\"cef89a60-b37e-40b5-8a2e-0a715b7661ce\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.460501 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.460478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-bound-sa-token\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.460581 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.460542 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvzxh\" (UniqueName: \"kubernetes.io/projected/f27a9a75-c190-4880-b981-29378f194918-kube-api-access-rvzxh\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:43:58.461168 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.461132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lglz7\" (UniqueName: \"kubernetes.io/projected/88b54b08-2047-4ef2-a2cc-91a34b8ebd53-kube-api-access-lglz7\") pod \"cluster-proxy-proxy-agent-5864889ff5-n5bc7\" (UID: \"88b54b08-2047-4ef2-a2cc-91a34b8ebd53\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.461538 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.461514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-installation-pull-secrets\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.463765 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.463734 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mwzgh\" (UniqueName: \"kubernetes.io/projected/f083405e-bfba-4773-8229-17a3c99d04cb-kube-api-access-mwzgh\") pod \"managed-serviceaccount-addon-agent-7766cfd6d7-24vwc\" (UID: \"f083405e-bfba-4773-8229-17a3c99d04cb\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" Apr 22 18:43:58.466220 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.464810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqgbm\" (UniqueName: \"kubernetes.io/projected/c89a564f-52a0-4145-8cad-2ae9ecec0329-kube-api-access-wqgbm\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:43:58.466220 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.465603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9t69\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-kube-api-access-g9t69\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:43:58.526362 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.526336 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" Apr 22 18:43:58.541947 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.541784 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:43:58.547531 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.547394 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:43:58.695113 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.695025 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"] Apr 22 18:43:58.697797 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.697775 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7"] Apr 22 18:43:58.700717 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.700696 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc"] Apr 22 18:43:58.703888 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:58.703859 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf083405e_bfba_4773_8229_17a3c99d04cb.slice/crio-b9d1ace34ab764a60e714f04a96e052a6da1c2fc303d00b195bc35db873a63b4 WatchSource:0}: Error finding container b9d1ace34ab764a60e714f04a96e052a6da1c2fc303d00b195bc35db873a63b4: Status 404 returned error can't find the container with id b9d1ace34ab764a60e714f04a96e052a6da1c2fc303d00b195bc35db873a63b4 Apr 22 18:43:58.704476 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:58.704454 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcef89a60_b37e_40b5_8a2e_0a715b7661ce.slice/crio-17ff283ba21685193c3d3ce12c8a592e3c284d79fc5477292f95ee407ee9b938 WatchSource:0}: Error finding container 17ff283ba21685193c3d3ce12c8a592e3c284d79fc5477292f95ee407ee9b938: Status 404 returned error can't find the container with id 17ff283ba21685193c3d3ce12c8a592e3c284d79fc5477292f95ee407ee9b938 Apr 22 18:43:58.954970 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.954894 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:43:58.954970 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.954935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.954980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:58.955026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955039 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955101 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:30.95508557 +0000 UTC m=+66.156806417 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : secret "metrics-daemon-secret" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955106 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955107 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955143 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:59.9551312 +0000 UTC m=+35.156852047 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955140 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955160 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955161 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:59.955147757 +0000 UTC m=+35.156868610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found
Apr 22 18:43:58.955254 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:58.955230 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed. No retries permitted until 2026-04-22 18:43:59.955215875 +0000 UTC m=+35.156936742 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found
Apr 22 18:43:59.156634 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.156597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:59.170075 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.170043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps84d\" (UniqueName: \"kubernetes.io/projected/37e1b769-57d0-4c74-9a5d-c4eca3f94231-kube-api-access-ps84d\") pod \"network-check-target-pjr62\" (UID: \"37e1b769-57d0-4c74-9a5d-c4eca3f94231\") " pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:59.240605 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.240525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:43:59.365735 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.365704 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pjr62"]
Apr 22 18:43:59.368850 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:43:59.368825 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e1b769_57d0_4c74_9a5d_c4eca3f94231.slice/crio-091d253ca24b21470f01183831e203ae220df67bb12b4f54a1c218ffff113664 WatchSource:0}: Error finding container 091d253ca24b21470f01183831e203ae220df67bb12b4f54a1c218ffff113664: Status 404 returned error can't find the container with id 091d253ca24b21470f01183831e203ae220df67bb12b4f54a1c218ffff113664
Apr 22 18:43:59.473150 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.473115 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" event={"ID":"cef89a60-b37e-40b5-8a2e-0a715b7661ce","Type":"ContainerStarted","Data":"17ff283ba21685193c3d3ce12c8a592e3c284d79fc5477292f95ee407ee9b938"}
Apr 22 18:43:59.475806 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.475153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" event={"ID":"f083405e-bfba-4773-8229-17a3c99d04cb","Type":"ContainerStarted","Data":"b9d1ace34ab764a60e714f04a96e052a6da1c2fc303d00b195bc35db873a63b4"}
Apr 22 18:43:59.478620 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.478594 2578 generic.go:358] "Generic (PLEG): container finished" podID="09c19634-8110-452e-9a84-963e44013755" containerID="8636ef8fed100b2a66a3ef9f2e185aa872d6d602f3d4792435eedec8e546d9e6" exitCode=0
Apr 22 18:43:59.478750 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.478665 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerDied","Data":"8636ef8fed100b2a66a3ef9f2e185aa872d6d602f3d4792435eedec8e546d9e6"}
Apr 22 18:43:59.479870 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.479848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pjr62" event={"ID":"37e1b769-57d0-4c74-9a5d-c4eca3f94231","Type":"ContainerStarted","Data":"091d253ca24b21470f01183831e203ae220df67bb12b4f54a1c218ffff113664"}
Apr 22 18:43:59.480854 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.480833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" event={"ID":"88b54b08-2047-4ef2-a2cc-91a34b8ebd53","Type":"ContainerStarted","Data":"3d3eb0f5a6d90b39182403a889ce736ffd4b1d521085dc4fb18aefecf40c2f2f"}
Apr 22 18:43:59.964091 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.964044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.964135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:43:59.964191 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:59.964314 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:59.964317 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:59.964335 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:59.964377 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:01.964358232 +0000 UTC m=+37.166079111 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:59.964395 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:01.964385587 +0000 UTC m=+37.166106439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:59.964421 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:43:59.964610 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:43:59.964447 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:01.964437129 +0000 UTC m=+37.166157981 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found
Apr 22 18:44:01.581952 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:01.581914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:44:01.586534 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:01.586309 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/827aef15-283e-49d7-8df1-ebaee65d73aa-original-pull-secret\") pod \"global-pull-secret-syncer-q9qsg\" (UID: \"827aef15-283e-49d7-8df1-ebaee65d73aa\") " pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:44:01.644570 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:01.644529 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q9qsg"
Apr 22 18:44:01.985089 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:01.984987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:44:01.985089 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:01.985060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:44:01.985315 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:01.985109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:44:01.985315 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:01.985258 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:44:01.985315 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:01.985273 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found
Apr 22 18:44:01.985460 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:01.985331 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:05.985313098 +0000 UTC m=+41.187033949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found
Apr 22 18:44:01.985684 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:01.985657 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:01.985684 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:01.985673 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:01.985862 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:01.985735 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:05.985717755 +0000 UTC m=+41.187438616 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:01.985862 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:01.985754 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:05.985745032 +0000 UTC m=+41.187465881 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found
Apr 22 18:44:06.016094 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:06.016050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:06.016111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:06.016159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:06.016230 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:06.016287 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:06.016304 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:06.016318 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:14.01629812 +0000 UTC m=+49.218018973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:06.016325 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:06.016346 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:14.016330891 +0000 UTC m=+49.218051738 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found
Apr 22 18:44:06.016557 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:06.016393 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:14.016375339 +0000 UTC m=+49.218096186 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found
Apr 22 18:44:08.838553 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:08.838503 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q9qsg"]
Apr 22 18:44:08.845200 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:44:08.845120 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod827aef15_283e_49d7_8df1_ebaee65d73aa.slice/crio-b87c9c7f9c9572074f4dbcd50eacb76a9a3dede7a03933aec1b8f472ac6e4a42 WatchSource:0}: Error finding container b87c9c7f9c9572074f4dbcd50eacb76a9a3dede7a03933aec1b8f472ac6e4a42: Status 404 returned error can't find the container with id b87c9c7f9c9572074f4dbcd50eacb76a9a3dede7a03933aec1b8f472ac6e4a42
Apr 22 18:44:09.515495 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.515403 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" event={"ID":"cef89a60-b37e-40b5-8a2e-0a715b7661ce","Type":"ContainerStarted","Data":"7dc7091cf429d592ebbbb9d500b630f712054bc2ea2a06286372f38fd7af3f23"}
Apr 22 18:44:09.515661 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.515633 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"
Apr 22 18:44:09.517328 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.517300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" event={"ID":"f083405e-bfba-4773-8229-17a3c99d04cb","Type":"ContainerStarted","Data":"207ecaa13ee65460595b79885f16936cfb9a9971eaa2c98fe2580aa9093aa57c"}
Apr 22 18:44:09.517902 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.517884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz"
Apr 22 18:44:09.518814 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.518776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q9qsg" event={"ID":"827aef15-283e-49d7-8df1-ebaee65d73aa","Type":"ContainerStarted","Data":"b87c9c7f9c9572074f4dbcd50eacb76a9a3dede7a03933aec1b8f472ac6e4a42"}
Apr 22 18:44:09.521248 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.521221 2578 generic.go:358] "Generic (PLEG): container finished" podID="09c19634-8110-452e-9a84-963e44013755" containerID="fd12cffa291b84abd5b0a38b243a79efd3d30a7c249efe418f968af210e06452" exitCode=0
Apr 22 18:44:09.521349 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.521292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerDied","Data":"fd12cffa291b84abd5b0a38b243a79efd3d30a7c249efe418f968af210e06452"}
Apr 22 18:44:09.522987 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.522962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pjr62" event={"ID":"37e1b769-57d0-4c74-9a5d-c4eca3f94231","Type":"ContainerStarted","Data":"1a69c62beb326edd86771921573e7f6d55ce03534c88243e9559027e6314323a"}
Apr 22 18:44:09.523256 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.523218 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pjr62"
Apr 22 18:44:09.524079 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.524061 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" event={"ID":"88b54b08-2047-4ef2-a2cc-91a34b8ebd53","Type":"ContainerStarted","Data":"d63f65f6423a7695f5030ce77103af6734c9917404479bc8451a4dca981cd8f9"}
Apr 22 18:44:09.533420 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.533290 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" podStartSLOduration=5.525418977 podStartE2EDuration="15.53327616s" podCreationTimestamp="2026-04-22 18:43:54 +0000 UTC" firstStartedPulling="2026-04-22 18:43:58.708024054 +0000 UTC m=+33.909744902" lastFinishedPulling="2026-04-22 18:44:08.715881224 +0000 UTC m=+43.917602085" observedRunningTime="2026-04-22 18:44:09.532757555 +0000 UTC m=+44.734478417" watchObservedRunningTime="2026-04-22 18:44:09.53327616 +0000 UTC m=+44.734997032"
Apr 22 18:44:09.551251 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.551206 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pjr62" podStartSLOduration=35.206887062 podStartE2EDuration="44.55116969s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:59.37066533 +0000 UTC m=+34.572386177" lastFinishedPulling="2026-04-22 18:44:08.714947944 +0000 UTC m=+43.916668805" observedRunningTime="2026-04-22 18:44:09.550609127 +0000 UTC m=+44.752330002" watchObservedRunningTime="2026-04-22 18:44:09.55116969 +0000 UTC m=+44.752890551"
Apr 22 18:44:09.608214 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:09.608133 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" podStartSLOduration=5.5977940109999995 podStartE2EDuration="15.608118035s" podCreationTimestamp="2026-04-22 18:43:54 +0000 UTC" firstStartedPulling="2026-04-22 18:43:58.706460571 +0000 UTC m=+33.908181419" lastFinishedPulling="2026-04-22 18:44:08.716784586 +0000 UTC m=+43.918505443" observedRunningTime="2026-04-22 18:44:09.607656046 +0000 UTC m=+44.809376918" watchObservedRunningTime="2026-04-22 18:44:09.608118035 +0000 UTC m=+44.809838916"
Apr 22 18:44:10.527837 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:10.527806 2578 generic.go:358] "Generic (PLEG): container finished" podID="09c19634-8110-452e-9a84-963e44013755" containerID="6d3486d1e730cf9599837324febdb7ca51b99b9fe7bc5cbae0b124b6ec04582a" exitCode=0
Apr 22 18:44:10.528298 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:10.527934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerDied","Data":"6d3486d1e730cf9599837324febdb7ca51b99b9fe7bc5cbae0b124b6ec04582a"}
Apr 22 18:44:11.533852 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:11.533812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" event={"ID":"09c19634-8110-452e-9a84-963e44013755","Type":"ContainerStarted","Data":"dc2bea2314686491b71b5d98292a56e8d7f4d4769819db9e0e33f8cd74577bdd"}
Apr 22 18:44:11.561481 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:11.561412 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jvwbm" podStartSLOduration=5.7903827329999995 podStartE2EDuration="46.561392364s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:43:27.968292767 +0000 UTC m=+3.170013620" lastFinishedPulling="2026-04-22 18:44:08.739302393 +0000 UTC m=+43.941023251" observedRunningTime="2026-04-22 18:44:11.556885175 +0000 UTC m=+46.758606047" watchObservedRunningTime="2026-04-22 18:44:11.561392364 +0000 UTC m=+46.763113233"
Apr 22 18:44:14.076255 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.076215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.076269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.076300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:14.076369 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:14.076387 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:14.076399 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:14.076412 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:14.076440 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:30.07642379 +0000 UTC m=+65.278144637 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:14.076460 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:30.076446862 +0000 UTC m=+65.278167709 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:14.076649 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:14.076473 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:30.076467224 +0000 UTC m=+65.278188071 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found
Apr 22 18:44:14.543192 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.543141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q9qsg" event={"ID":"827aef15-283e-49d7-8df1-ebaee65d73aa","Type":"ContainerStarted","Data":"9fdd209e0e933bd3f3305cf65d8318a242cba9c7fc1354b55236a49fa25cd814"}
Apr 22 18:44:14.544776 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.544754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" event={"ID":"88b54b08-2047-4ef2-a2cc-91a34b8ebd53","Type":"ContainerStarted","Data":"423286d182507a6ad33ce2ff3fc987dfa77a000ee975cca3d9b374f367d6d085"}
Apr 22 18:44:14.544872 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.544785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" event={"ID":"88b54b08-2047-4ef2-a2cc-91a34b8ebd53","Type":"ContainerStarted","Data":"e0bfbe576f7f7c53ad993a2428c3d63e0eae116c6e9e883e24f4cd68a98df17a"}
Apr 22 18:44:14.558882 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.558833 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q9qsg" podStartSLOduration=40.847430845 podStartE2EDuration="45.558820727s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:44:08.847236217 +0000 UTC m=+44.048957066" lastFinishedPulling="2026-04-22 18:44:13.558626088 +0000 UTC m=+48.760346948" observedRunningTime="2026-04-22 18:44:14.557965786 +0000 UTC m=+49.759686656" watchObservedRunningTime="2026-04-22 18:44:14.558820727 +0000 UTC m=+49.760541595"
Apr 22 18:44:14.580869 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:14.580828 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" podStartSLOduration=5.7332603330000005 podStartE2EDuration="20.580817111s" podCreationTimestamp="2026-04-22 18:43:54 +0000 UTC" firstStartedPulling="2026-04-22 18:43:58.70814396 +0000 UTC m=+33.909864825" lastFinishedPulling="2026-04-22 18:44:13.555700756 +0000 UTC m=+48.757421603" observedRunningTime="2026-04-22 18:44:14.580236717 +0000 UTC m=+49.781957578" watchObservedRunningTime="2026-04-22 18:44:14.580817111 +0000 UTC m=+49.782537979"
Apr 22 18:44:23.473220 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:23.473190 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mm8rp"
Apr 22 18:44:30.093001 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:30.092944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:30.093021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:30.093074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:30.093095 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:30.093157 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:02.093141879 +0000 UTC m=+97.294862726 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:30.093156 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:30.093194 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:30.093166 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:30.093234 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed.
No retries permitted until 2026-04-22 18:45:02.09322204 +0000 UTC m=+97.294942892 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found Apr 22 18:44:30.093481 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:30.093247 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:02.093241289 +0000 UTC m=+97.294962136 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found Apr 22 18:44:31.000447 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:31.000403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:44:31.000622 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:31.000560 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:44:31.000665 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:44:31.000622 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:45:35.000605285 +0000 UTC m=+130.202326148 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : secret "metrics-daemon-secret" not found Apr 22 18:44:40.531406 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:44:40.531372 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pjr62" Apr 22 18:45:02.132966 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:02.132926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:02.132986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8" Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:02.133015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:02.133089 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found 
Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:02.133101 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:02.133118 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:02.133126 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f7876885c-jjjr6: secret "image-registry-tls" not found
Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:02.133169 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert podName:f27a9a75-c190-4880-b981-29378f194918 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:06.133147737 +0000 UTC m=+161.334868595 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert") pod "ingress-canary-btplr" (UID: "f27a9a75-c190-4880-b981-29378f194918") : secret "canary-serving-cert" not found
Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:02.133207 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls podName:c89a564f-52a0-4145-8cad-2ae9ecec0329 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:06.133199829 +0000 UTC m=+161.334920680 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls") pod "dns-default-vkwg8" (UID: "c89a564f-52a0-4145-8cad-2ae9ecec0329") : secret "dns-default-metrics-tls" not found
Apr 22 18:45:02.133398 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:02.133222 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls podName:7a0225c7-98ca-4a53-925d-9a970a77218d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:06.133213761 +0000 UTC m=+161.334934610 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls") pod "image-registry-5f7876885c-jjjr6" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d") : secret "image-registry-tls" not found
Apr 22 18:45:35.072052 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:35.072006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx"
Apr 22 18:45:35.072561 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:35.072140 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:45:35.072561 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:45:35.072225 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs podName:111ee8c4-f2a7-4e7b-8faf-15392cc75774 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:37.072209867 +0000 UTC m=+252.273930715 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs") pod "network-metrics-daemon-gzjvx" (UID: "111ee8c4-f2a7-4e7b-8faf-15392cc75774") : secret "metrics-daemon-secret" not found
Apr 22 18:45:35.659514 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:35.659486 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b9bb8_09d3f614-cd83-4e5b-8bb6-06b778d0eda3/dns-node-resolver/0.log"
Apr 22 18:45:36.266324 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:36.266296 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6268s_a42ca4fd-ca90-4584-971c-d1d61ff097f6/node-ca/0.log"
Apr 22 18:45:48.548275 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:48.548214 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" podUID="88b54b08-2047-4ef2-a2cc-91a34b8ebd53" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:45:58.548825 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.548784 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" podUID="88b54b08-2047-4ef2-a2cc-91a34b8ebd53" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:45:58.895683 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.895613 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-87l95"]
Apr 22 18:45:58.898600 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.898582 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:58.901004 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.900983 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:45:58.902168 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.902148 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:45:58.902272 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.902149 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zv9q9\""
Apr 22 18:45:58.902272 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.902148 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:45:58.902272 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.902158 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:45:58.924474 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.924445 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87l95"]
Apr 22 18:45:58.941722 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.941694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19bf46ad-21ba-4f5c-94db-81d68fd09368-data-volume\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:58.941852 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.941742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19bf46ad-21ba-4f5c-94db-81d68fd09368-crio-socket\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:58.941852 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.941771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wn27\" (UniqueName: \"kubernetes.io/projected/19bf46ad-21ba-4f5c-94db-81d68fd09368-kube-api-access-2wn27\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:58.941925 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.941850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19bf46ad-21ba-4f5c-94db-81d68fd09368-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:58.941925 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:58.941912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19bf46ad-21ba-4f5c-94db-81d68fd09368-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.042841 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.042804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19bf46ad-21ba-4f5c-94db-81d68fd09368-data-volume\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.043024 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.042855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19bf46ad-21ba-4f5c-94db-81d68fd09368-crio-socket\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.043024 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.042887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wn27\" (UniqueName: \"kubernetes.io/projected/19bf46ad-21ba-4f5c-94db-81d68fd09368-kube-api-access-2wn27\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.043024 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.042919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19bf46ad-21ba-4f5c-94db-81d68fd09368-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.043024 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.042960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/19bf46ad-21ba-4f5c-94db-81d68fd09368-crio-socket\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.043024 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.042978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19bf46ad-21ba-4f5c-94db-81d68fd09368-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.043295 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.043241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/19bf46ad-21ba-4f5c-94db-81d68fd09368-data-volume\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.043439 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.043422 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/19bf46ad-21ba-4f5c-94db-81d68fd09368-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.045522 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.045507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/19bf46ad-21ba-4f5c-94db-81d68fd09368-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.066499 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.066477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wn27\" (UniqueName: \"kubernetes.io/projected/19bf46ad-21ba-4f5c-94db-81d68fd09368-kube-api-access-2wn27\") pod \"insights-runtime-extractor-87l95\" (UID: \"19bf46ad-21ba-4f5c-94db-81d68fd09368\") " pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.207236 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.207134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87l95"
Apr 22 18:45:59.337497 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:45:59.337454 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19bf46ad_21ba_4f5c_94db_81d68fd09368.slice/crio-53df7d33de164cb288eacb2f90c2b325f04a58ec920f205e70689feb38f5b529 WatchSource:0}: Error finding container 53df7d33de164cb288eacb2f90c2b325f04a58ec920f205e70689feb38f5b529: Status 404 returned error can't find the container with id 53df7d33de164cb288eacb2f90c2b325f04a58ec920f205e70689feb38f5b529
Apr 22 18:45:59.337952 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.337930 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87l95"]
Apr 22 18:45:59.791309 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.791263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87l95" event={"ID":"19bf46ad-21ba-4f5c-94db-81d68fd09368","Type":"ContainerStarted","Data":"994ff15abff127cf1fe7ce852d4d769ed7cd2330d8b686da36c057604b1963e5"}
Apr 22 18:45:59.791309 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:45:59.791311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87l95" event={"ID":"19bf46ad-21ba-4f5c-94db-81d68fd09368","Type":"ContainerStarted","Data":"53df7d33de164cb288eacb2f90c2b325f04a58ec920f205e70689feb38f5b529"}
Apr 22 18:46:00.799948 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:00.799915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87l95" event={"ID":"19bf46ad-21ba-4f5c-94db-81d68fd09368","Type":"ContainerStarted","Data":"290c5f9f6ed8eea0d898392104072107b5d1650ba35fb2020de9562ffc1483ef"}
Apr 22 18:46:01.235487 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:46:01.235396 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" podUID="7a0225c7-98ca-4a53-925d-9a970a77218d"
Apr 22 18:46:01.265817 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:46:01.265769 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vkwg8" podUID="c89a564f-52a0-4145-8cad-2ae9ecec0329"
Apr 22 18:46:01.270438 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:46:01.270408 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-btplr" podUID="f27a9a75-c190-4880-b981-29378f194918"
Apr 22 18:46:01.334905 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:46:01.334861 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gzjvx" podUID="111ee8c4-f2a7-4e7b-8faf-15392cc75774"
Apr 22 18:46:01.806738 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:01.806712 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:46:01.806738 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:01.806726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87l95" event={"ID":"19bf46ad-21ba-4f5c-94db-81d68fd09368","Type":"ContainerStarted","Data":"712bebfbd30f9a7767946fcf2007e8dbbfe33e8f6463560d51a5e6a3e2890dcf"}
Apr 22 18:46:01.807230 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:01.806987 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:46:01.826028 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:01.825983 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-87l95" podStartSLOduration=1.7992625439999999 podStartE2EDuration="3.82597318s" podCreationTimestamp="2026-04-22 18:45:58 +0000 UTC" firstStartedPulling="2026-04-22 18:45:59.388432292 +0000 UTC m=+154.590153142" lastFinishedPulling="2026-04-22 18:46:01.415142931 +0000 UTC m=+156.616863778" observedRunningTime="2026-04-22 18:46:01.825070053 +0000 UTC m=+157.026790922" watchObservedRunningTime="2026-04-22 18:46:01.82597318 +0000 UTC m=+157.027694047"
Apr 22 18:46:06.199447 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.199413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:46:06.199447 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.199452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:46:06.199933 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.199487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:46:06.201907 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.201884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c89a564f-52a0-4145-8cad-2ae9ecec0329-metrics-tls\") pod \"dns-default-vkwg8\" (UID: \"c89a564f-52a0-4145-8cad-2ae9ecec0329\") " pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:46:06.202030 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.202012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f27a9a75-c190-4880-b981-29378f194918-cert\") pod \"ingress-canary-btplr\" (UID: \"f27a9a75-c190-4880-b981-29378f194918\") " pod="openshift-ingress-canary/ingress-canary-btplr"
Apr 22 18:46:06.202092 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.202026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"image-registry-5f7876885c-jjjr6\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:46:06.310870 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.310833 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xcpp5\""
Apr 22 18:46:06.310870 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.310832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nnjgq\""
Apr 22 18:46:06.318779 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.318757 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:46:06.318843 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.318785 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vkwg8"
Apr 22 18:46:06.450692 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.450621 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f7876885c-jjjr6"]
Apr 22 18:46:06.454560 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:46:06.454534 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a0225c7_98ca_4a53_925d_9a970a77218d.slice/crio-9eba0f524b71f1bc6d7efca7b647bf10f6d77fc7a8f02c0d9d47fcf37643ccf0 WatchSource:0}: Error finding container 9eba0f524b71f1bc6d7efca7b647bf10f6d77fc7a8f02c0d9d47fcf37643ccf0: Status 404 returned error can't find the container with id 9eba0f524b71f1bc6d7efca7b647bf10f6d77fc7a8f02c0d9d47fcf37643ccf0
Apr 22 18:46:06.465635 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.464219 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vkwg8"]
Apr 22 18:46:06.469069 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:46:06.469045 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89a564f_52a0_4145_8cad_2ae9ecec0329.slice/crio-d932e298d952879c59e9966a54e52266ab85188d8f113ae6497e5fe02452ecd7 WatchSource:0}: Error finding container d932e298d952879c59e9966a54e52266ab85188d8f113ae6497e5fe02452ecd7: Status 404 returned error can't find the container with id d932e298d952879c59e9966a54e52266ab85188d8f113ae6497e5fe02452ecd7
Apr 22 18:46:06.820435 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.820389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkwg8" event={"ID":"c89a564f-52a0-4145-8cad-2ae9ecec0329","Type":"ContainerStarted","Data":"d932e298d952879c59e9966a54e52266ab85188d8f113ae6497e5fe02452ecd7"}
Apr 22 18:46:06.821650 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.821620 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" event={"ID":"7a0225c7-98ca-4a53-925d-9a970a77218d","Type":"ContainerStarted","Data":"916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb"}
Apr 22 18:46:06.821756 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.821654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" event={"ID":"7a0225c7-98ca-4a53-925d-9a970a77218d","Type":"ContainerStarted","Data":"9eba0f524b71f1bc6d7efca7b647bf10f6d77fc7a8f02c0d9d47fcf37643ccf0"}
Apr 22 18:46:06.821880 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.821860 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6"
Apr 22 18:46:06.843864 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:06.843821 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" podStartSLOduration=161.843807116 podStartE2EDuration="2m41.843807116s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:06.84228682 +0000 UTC m=+162.044007689" watchObservedRunningTime="2026-04-22 18:46:06.843807116 +0000 UTC m=+162.045527984"
Apr 22 18:46:07.417113 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.417081 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sb4w9"]
Apr 22 18:46:07.420375 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.420349 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sb4w9"
Apr 22 18:46:07.422432 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.422407 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:46:07.423652 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.423614 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:46:07.423652 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.423646 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:46:07.423861 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.423844 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:46:07.423936 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.423869 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xvkjw\""
Apr 22 18:46:07.423936 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.423881 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:46:07.424030 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.423979 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:46:07.510110 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510080 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-root\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9"
Apr 22 18:46:07.510110 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-sys\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9"
Apr 22 18:46:07.510344 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8mh\" (UniqueName: \"kubernetes.io/projected/51a228a5-6cb1-4fe6-a0bd-b497349eee85-kube-api-access-cz8mh\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9"
Apr 22 18:46:07.510344 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-wtmp\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9"
Apr 22 18:46:07.510344 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9"
Apr 22 18:46:07.510467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510359
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51a228a5-6cb1-4fe6-a0bd-b497349eee85-metrics-client-ca\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.510467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510381 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-tls\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.510467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-textfile\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.510467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.510422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-accelerators-collector-config\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.610953 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.610922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-wtmp\") pod \"node-exporter-sb4w9\" (UID: 
\"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611138 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.610965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611138 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.610999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51a228a5-6cb1-4fe6-a0bd-b497349eee85-metrics-client-ca\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611138 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-tls\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611138 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-textfile\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611138 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-accelerators-collector-config\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611138 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-root\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611449 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-sys\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611449 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8mh\" (UniqueName: \"kubernetes.io/projected/51a228a5-6cb1-4fe6-a0bd-b497349eee85-kube-api-access-cz8mh\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611449 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-wtmp\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611449 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-root\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611449 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:46:07.611222 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:46:07.611449 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:46:07.611375 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-tls podName:51a228a5-6cb1-4fe6-a0bd-b497349eee85 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:08.111355951 +0000 UTC m=+163.313076811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-tls") pod "node-exporter-sb4w9" (UID: "51a228a5-6cb1-4fe6-a0bd-b497349eee85") : secret "node-exporter-tls" not found Apr 22 18:46:07.611681 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611504 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/51a228a5-6cb1-4fe6-a0bd-b497349eee85-sys\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611729 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51a228a5-6cb1-4fe6-a0bd-b497349eee85-metrics-client-ca\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611729 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-accelerators-collector-config\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.611729 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.611713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-textfile\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.614033 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.614009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:07.623819 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:07.623787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8mh\" (UniqueName: \"kubernetes.io/projected/51a228a5-6cb1-4fe6-a0bd-b497349eee85-kube-api-access-cz8mh\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:08.115645 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.115558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-tls\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:08.118040 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:46:08.118008 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/51a228a5-6cb1-4fe6-a0bd-b497349eee85-node-exporter-tls\") pod \"node-exporter-sb4w9\" (UID: \"51a228a5-6cb1-4fe6-a0bd-b497349eee85\") " pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:08.331806 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.331767 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sb4w9" Apr 22 18:46:08.339546 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:46:08.339516 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a228a5_6cb1_4fe6_a0bd_b497349eee85.slice/crio-0028aa27419c43b17867900c1c0962b09de24ecc1c79b6afd34b1972f123b771 WatchSource:0}: Error finding container 0028aa27419c43b17867900c1c0962b09de24ecc1c79b6afd34b1972f123b771: Status 404 returned error can't find the container with id 0028aa27419c43b17867900c1c0962b09de24ecc1c79b6afd34b1972f123b771 Apr 22 18:46:08.548813 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.548775 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" podUID="88b54b08-2047-4ef2-a2cc-91a34b8ebd53" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:46:08.549162 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.548843 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" Apr 22 18:46:08.549345 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.549315 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
containerStatusID={"Type":"cri-o","ID":"423286d182507a6ad33ce2ff3fc987dfa77a000ee975cca3d9b374f367d6d085"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:46:08.549389 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.549378 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" podUID="88b54b08-2047-4ef2-a2cc-91a34b8ebd53" containerName="service-proxy" containerID="cri-o://423286d182507a6ad33ce2ff3fc987dfa77a000ee975cca3d9b374f367d6d085" gracePeriod=30 Apr 22 18:46:08.831013 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.830913 2578 generic.go:358] "Generic (PLEG): container finished" podID="88b54b08-2047-4ef2-a2cc-91a34b8ebd53" containerID="423286d182507a6ad33ce2ff3fc987dfa77a000ee975cca3d9b374f367d6d085" exitCode=2 Apr 22 18:46:08.831013 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.830992 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" event={"ID":"88b54b08-2047-4ef2-a2cc-91a34b8ebd53","Type":"ContainerDied","Data":"423286d182507a6ad33ce2ff3fc987dfa77a000ee975cca3d9b374f367d6d085"} Apr 22 18:46:08.831254 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.831036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5864889ff5-n5bc7" event={"ID":"88b54b08-2047-4ef2-a2cc-91a34b8ebd53","Type":"ContainerStarted","Data":"e09a4e3634ab4b82d049c08b0808f9053e9b91018cd51cd1cec1743821a7a2a5"} Apr 22 18:46:08.832282 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.832253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb4w9" 
event={"ID":"51a228a5-6cb1-4fe6-a0bd-b497349eee85","Type":"ContainerStarted","Data":"0028aa27419c43b17867900c1c0962b09de24ecc1c79b6afd34b1972f123b771"} Apr 22 18:46:08.834120 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.834086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkwg8" event={"ID":"c89a564f-52a0-4145-8cad-2ae9ecec0329","Type":"ContainerStarted","Data":"fb464adaa33abe248d7d43da1fde49cdab561ea051f84bb04a2d761975ca17cb"} Apr 22 18:46:08.834120 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.834118 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkwg8" event={"ID":"c89a564f-52a0-4145-8cad-2ae9ecec0329","Type":"ContainerStarted","Data":"22f96ddacbd70005621dd407add689b56fce4c9e7f15700d75c439fbabe82d68"} Apr 22 18:46:08.834324 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.834303 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vkwg8" Apr 22 18:46:08.872069 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:08.872015 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vkwg8" podStartSLOduration=129.484251634 podStartE2EDuration="2m10.871999819s" podCreationTimestamp="2026-04-22 18:43:58 +0000 UTC" firstStartedPulling="2026-04-22 18:46:06.470646114 +0000 UTC m=+161.672366961" lastFinishedPulling="2026-04-22 18:46:07.858394294 +0000 UTC m=+163.060115146" observedRunningTime="2026-04-22 18:46:08.871132426 +0000 UTC m=+164.072853296" watchObservedRunningTime="2026-04-22 18:46:08.871999819 +0000 UTC m=+164.073720687" Apr 22 18:46:09.516884 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.516792 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" podUID="cef89a60-b37e-40b5-8a2e-0a715b7661ce" containerName="acm-agent" probeResult="failure" output="Get 
\"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused" Apr 22 18:46:09.837718 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.837678 2578 generic.go:358] "Generic (PLEG): container finished" podID="cef89a60-b37e-40b5-8a2e-0a715b7661ce" containerID="7dc7091cf429d592ebbbb9d500b630f712054bc2ea2a06286372f38fd7af3f23" exitCode=1 Apr 22 18:46:09.838146 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.837754 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" event={"ID":"cef89a60-b37e-40b5-8a2e-0a715b7661ce","Type":"ContainerDied","Data":"7dc7091cf429d592ebbbb9d500b630f712054bc2ea2a06286372f38fd7af3f23"} Apr 22 18:46:09.838146 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.838093 2578 scope.go:117] "RemoveContainer" containerID="7dc7091cf429d592ebbbb9d500b630f712054bc2ea2a06286372f38fd7af3f23" Apr 22 18:46:09.839121 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.839100 2578 generic.go:358] "Generic (PLEG): container finished" podID="f083405e-bfba-4773-8229-17a3c99d04cb" containerID="207ecaa13ee65460595b79885f16936cfb9a9971eaa2c98fe2580aa9093aa57c" exitCode=255 Apr 22 18:46:09.839245 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.839195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" event={"ID":"f083405e-bfba-4773-8229-17a3c99d04cb","Type":"ContainerDied","Data":"207ecaa13ee65460595b79885f16936cfb9a9971eaa2c98fe2580aa9093aa57c"} Apr 22 18:46:09.839504 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.839488 2578 scope.go:117] "RemoveContainer" containerID="207ecaa13ee65460595b79885f16936cfb9a9971eaa2c98fe2580aa9093aa57c" Apr 22 18:46:09.840669 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.840652 2578 generic.go:358] "Generic (PLEG): container finished" podID="51a228a5-6cb1-4fe6-a0bd-b497349eee85" 
containerID="e239888b978e1b0e04d4334823617cf0848ababb659b621cce4459d29a8cbc38" exitCode=0 Apr 22 18:46:09.840741 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:09.840719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb4w9" event={"ID":"51a228a5-6cb1-4fe6-a0bd-b497349eee85","Type":"ContainerDied","Data":"e239888b978e1b0e04d4334823617cf0848ababb659b621cce4459d29a8cbc38"} Apr 22 18:46:10.845321 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:10.845286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb4w9" event={"ID":"51a228a5-6cb1-4fe6-a0bd-b497349eee85","Type":"ContainerStarted","Data":"28b8c50e92e32783fb39db6d15cfbe01dea3b510f65368f7b0fbfc4c0074c1e7"} Apr 22 18:46:10.845321 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:10.845319 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb4w9" event={"ID":"51a228a5-6cb1-4fe6-a0bd-b497349eee85","Type":"ContainerStarted","Data":"9f46d780a65f3fcd6ca5f9ad26d1cec4d4a1c3c04dceb3b3dbd3c2fb0597047f"} Apr 22 18:46:10.846858 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:10.846839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" event={"ID":"cef89a60-b37e-40b5-8a2e-0a715b7661ce","Type":"ContainerStarted","Data":"0f9ce4cf3746056c9346edf33af621d9031a23313111054a1a1dea9b8abd71f0"} Apr 22 18:46:10.847110 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:10.847087 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:46:10.847729 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:10.847710 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7d588db7bf-8v7mz" Apr 22 18:46:10.848589 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:46:10.848572 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7766cfd6d7-24vwc" event={"ID":"f083405e-bfba-4773-8229-17a3c99d04cb","Type":"ContainerStarted","Data":"fd838f30ea04bd8a5db83f05395a39b148d3a249301fcfab91b2e2407533fe8c"} Apr 22 18:46:10.864915 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:10.864878 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sb4w9" podStartSLOduration=3.085270902 podStartE2EDuration="3.864866215s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:08.34114651 +0000 UTC m=+163.542867357" lastFinishedPulling="2026-04-22 18:46:09.120741822 +0000 UTC m=+164.322462670" observedRunningTime="2026-04-22 18:46:10.863702549 +0000 UTC m=+166.065423418" watchObservedRunningTime="2026-04-22 18:46:10.864866215 +0000 UTC m=+166.066587084" Apr 22 18:46:12.325148 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:12.325051 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:46:13.324829 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:13.324788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:46:13.327823 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:13.327797 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xmncv\"" Apr 22 18:46:13.335126 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:13.335111 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-btplr" Apr 22 18:46:13.449198 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:13.449149 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-btplr"] Apr 22 18:46:13.451973 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:46:13.451948 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27a9a75_c190_4880_b981_29378f194918.slice/crio-8af9c1ae75b5cb74fcb6e6ddfd4e2d2c2661631a882c1581204f7f1fa4283462 WatchSource:0}: Error finding container 8af9c1ae75b5cb74fcb6e6ddfd4e2d2c2661631a882c1581204f7f1fa4283462: Status 404 returned error can't find the container with id 8af9c1ae75b5cb74fcb6e6ddfd4e2d2c2661631a882c1581204f7f1fa4283462 Apr 22 18:46:13.861355 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:13.861305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-btplr" event={"ID":"f27a9a75-c190-4880-b981-29378f194918","Type":"ContainerStarted","Data":"8af9c1ae75b5cb74fcb6e6ddfd4e2d2c2661631a882c1581204f7f1fa4283462"} Apr 22 18:46:15.869368 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:15.869332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-btplr" event={"ID":"f27a9a75-c190-4880-b981-29378f194918","Type":"ContainerStarted","Data":"a90582c51092beed7142f72c88ed21fd184f9af8bf20dbd7f4d123eac1cd098e"} Apr 22 18:46:15.886886 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:15.886837 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-btplr" podStartSLOduration=136.465533077 podStartE2EDuration="2m17.886824634s" podCreationTimestamp="2026-04-22 18:43:58 +0000 UTC" firstStartedPulling="2026-04-22 18:46:13.453733056 +0000 UTC m=+168.655453902" lastFinishedPulling="2026-04-22 18:46:14.875024609 +0000 UTC m=+170.076745459" 
observedRunningTime="2026-04-22 18:46:15.886097372 +0000 UTC m=+171.087818241" watchObservedRunningTime="2026-04-22 18:46:15.886824634 +0000 UTC m=+171.088545503" Apr 22 18:46:18.843049 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:18.843017 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vkwg8" Apr 22 18:46:20.977022 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:20.976993 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f7876885c-jjjr6"] Apr 22 18:46:30.982066 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:30.982032 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:46:45.995859 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:45.995795 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" podUID="7a0225c7-98ca-4a53-925d-9a970a77218d" containerName="registry" containerID="cri-o://916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb" gracePeriod=30 Apr 22 18:46:46.223234 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.223211 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:46:46.406657 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.406613 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-trusted-ca\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.406657 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.406660 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-certificates\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.406916 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.406679 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-image-registry-private-configuration\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.406916 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.406704 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9t69\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-kube-api-access-g9t69\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.406916 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.406799 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: 
\"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.406916 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.406853 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-installation-pull-secrets\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.407119 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.407042 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a0225c7-98ca-4a53-925d-9a970a77218d-ca-trust-extracted\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.407119 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.407076 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:46.407268 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.407130 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-bound-sa-token\") pod \"7a0225c7-98ca-4a53-925d-9a970a77218d\" (UID: \"7a0225c7-98ca-4a53-925d-9a970a77218d\") " Apr 22 18:46:46.407268 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.407142 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:46.407398 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.407377 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-trusted-ca\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.407466 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.407403 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-certificates\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.409345 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.409312 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:46.409467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.409386 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:46.409627 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.409602 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:46.409692 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.409625 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-kube-api-access-g9t69" (OuterVolumeSpecName: "kube-api-access-g9t69") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "kube-api-access-g9t69". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:46.409936 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.409910 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:46.415903 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.415878 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a0225c7-98ca-4a53-925d-9a970a77218d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7a0225c7-98ca-4a53-925d-9a970a77218d" (UID: "7a0225c7-98ca-4a53-925d-9a970a77218d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:46:46.507922 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.507878 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a0225c7-98ca-4a53-925d-9a970a77218d-ca-trust-extracted\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.507922 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.507920 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-bound-sa-token\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.507922 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.507932 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-image-registry-private-configuration\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.508163 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.507943 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9t69\" (UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-kube-api-access-g9t69\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.508163 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.507952 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/7a0225c7-98ca-4a53-925d-9a970a77218d-registry-tls\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.508163 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.507962 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a0225c7-98ca-4a53-925d-9a970a77218d-installation-pull-secrets\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:46:46.958225 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.958191 2578 generic.go:358] "Generic (PLEG): container finished" podID="7a0225c7-98ca-4a53-925d-9a970a77218d" containerID="916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb" exitCode=0 Apr 22 18:46:46.958421 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.958263 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" Apr 22 18:46:46.958421 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.958264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" event={"ID":"7a0225c7-98ca-4a53-925d-9a970a77218d","Type":"ContainerDied","Data":"916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb"} Apr 22 18:46:46.958421 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.958370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f7876885c-jjjr6" event={"ID":"7a0225c7-98ca-4a53-925d-9a970a77218d","Type":"ContainerDied","Data":"9eba0f524b71f1bc6d7efca7b647bf10f6d77fc7a8f02c0d9d47fcf37643ccf0"} Apr 22 18:46:46.958421 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.958384 2578 scope.go:117] "RemoveContainer" containerID="916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb" Apr 22 18:46:46.966805 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.966775 2578 scope.go:117] "RemoveContainer" 
containerID="916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb" Apr 22 18:46:46.967065 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:46:46.967047 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb\": container with ID starting with 916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb not found: ID does not exist" containerID="916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb" Apr 22 18:46:46.967122 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.967075 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb"} err="failed to get container status \"916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb\": rpc error: code = NotFound desc = could not find container \"916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb\": container with ID starting with 916766cc810f9ba72b6bb9e87682e0b10cde21598be65102344a741b96e728bb not found: ID does not exist" Apr 22 18:46:46.983815 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.983775 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f7876885c-jjjr6"] Apr 22 18:46:46.990875 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:46.990847 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5f7876885c-jjjr6"] Apr 22 18:46:47.328698 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:47.328665 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0225c7-98ca-4a53-925d-9a970a77218d" path="/var/lib/kubelet/pods/7a0225c7-98ca-4a53-925d-9a970a77218d/volumes" Apr 22 18:46:54.092267 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:46:54.092238 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-btplr_f27a9a75-c190-4880-b981-29378f194918/serve-healthcheck-canary/0.log" Apr 22 18:47:37.161195 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:37.161129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:47:37.163566 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:37.163546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/111ee8c4-f2a7-4e7b-8faf-15392cc75774-metrics-certs\") pod \"network-metrics-daemon-gzjvx\" (UID: \"111ee8c4-f2a7-4e7b-8faf-15392cc75774\") " pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:47:37.228005 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:37.227983 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgrtw\"" Apr 22 18:47:37.235877 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:37.235856 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gzjvx" Apr 22 18:47:37.352624 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:37.352484 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gzjvx"] Apr 22 18:47:37.355073 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:47:37.355042 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod111ee8c4_f2a7_4e7b_8faf_15392cc75774.slice/crio-46bea6f54f5b0f5a0db201d8532df43bd223c08d84b6901ee517200dd00fb33f WatchSource:0}: Error finding container 46bea6f54f5b0f5a0db201d8532df43bd223c08d84b6901ee517200dd00fb33f: Status 404 returned error can't find the container with id 46bea6f54f5b0f5a0db201d8532df43bd223c08d84b6901ee517200dd00fb33f Apr 22 18:47:38.093472 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:38.093442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gzjvx" event={"ID":"111ee8c4-f2a7-4e7b-8faf-15392cc75774","Type":"ContainerStarted","Data":"46bea6f54f5b0f5a0db201d8532df43bd223c08d84b6901ee517200dd00fb33f"} Apr 22 18:47:39.096976 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:39.096939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gzjvx" event={"ID":"111ee8c4-f2a7-4e7b-8faf-15392cc75774","Type":"ContainerStarted","Data":"f6991f723cbd44b7479d8cd54a11672b408b3e74973d19343802522fbb283162"} Apr 22 18:47:39.096976 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:39.096980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gzjvx" event={"ID":"111ee8c4-f2a7-4e7b-8faf-15392cc75774","Type":"ContainerStarted","Data":"4e0b27448a7551f31786db51479801061041d7ef5f08d404d8ed1671e1e47772"} Apr 22 18:47:39.116473 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:47:39.116424 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-gzjvx" podStartSLOduration=253.098774497 podStartE2EDuration="4m14.116408093s" podCreationTimestamp="2026-04-22 18:43:25 +0000 UTC" firstStartedPulling="2026-04-22 18:47:37.356672782 +0000 UTC m=+252.558393628" lastFinishedPulling="2026-04-22 18:47:38.374306376 +0000 UTC m=+253.576027224" observedRunningTime="2026-04-22 18:47:39.114763865 +0000 UTC m=+254.316484744" watchObservedRunningTime="2026-04-22 18:47:39.116408093 +0000 UTC m=+254.318128976" Apr 22 18:48:25.201716 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:25.201689 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log" Apr 22 18:48:25.203556 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:25.203535 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log" Apr 22 18:48:25.205074 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:25.205056 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:48:49.662590 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.662555 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-ftc8z"] Apr 22 18:48:49.664913 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.662791 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a0225c7-98ca-4a53-925d-9a970a77218d" containerName="registry" Apr 22 18:48:49.664913 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.662803 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0225c7-98ca-4a53-925d-9a970a77218d" containerName="registry" Apr 22 18:48:49.664913 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.662874 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a0225c7-98ca-4a53-925d-9a970a77218d" containerName="registry" Apr 
22 18:48:49.665786 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.665770 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:49.668141 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.668121 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 18:48:49.669014 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.668993 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 18:48:49.669120 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.669003 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-chf7x\"" Apr 22 18:48:49.675103 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.675078 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-ftc8z"] Apr 22 18:48:49.759382 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.759346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxxk\" (UniqueName: \"kubernetes.io/projected/8352304e-0fd7-4e77-b4d5-73c48a6ed88f-kube-api-access-zgxxk\") pod \"cert-manager-cainjector-68b757865b-ftc8z\" (UID: \"8352304e-0fd7-4e77-b4d5-73c48a6ed88f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:49.759562 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.759404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8352304e-0fd7-4e77-b4d5-73c48a6ed88f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-ftc8z\" (UID: \"8352304e-0fd7-4e77-b4d5-73c48a6ed88f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:49.860675 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.860645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8352304e-0fd7-4e77-b4d5-73c48a6ed88f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-ftc8z\" (UID: \"8352304e-0fd7-4e77-b4d5-73c48a6ed88f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:49.860823 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.860704 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxxk\" (UniqueName: \"kubernetes.io/projected/8352304e-0fd7-4e77-b4d5-73c48a6ed88f-kube-api-access-zgxxk\") pod \"cert-manager-cainjector-68b757865b-ftc8z\" (UID: \"8352304e-0fd7-4e77-b4d5-73c48a6ed88f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:49.870821 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.870798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxxk\" (UniqueName: \"kubernetes.io/projected/8352304e-0fd7-4e77-b4d5-73c48a6ed88f-kube-api-access-zgxxk\") pod \"cert-manager-cainjector-68b757865b-ftc8z\" (UID: \"8352304e-0fd7-4e77-b4d5-73c48a6ed88f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:49.871253 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.871237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8352304e-0fd7-4e77-b4d5-73c48a6ed88f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-ftc8z\" (UID: \"8352304e-0fd7-4e77-b4d5-73c48a6ed88f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:49.975102 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:49.975012 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" Apr 22 18:48:50.095482 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:50.095460 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-ftc8z"] Apr 22 18:48:50.097628 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:48:50.097601 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8352304e_0fd7_4e77_b4d5_73c48a6ed88f.slice/crio-b76fc338b543506912f402476772a6ab47911bbf1468cc0a6c788acad152aeb0 WatchSource:0}: Error finding container b76fc338b543506912f402476772a6ab47911bbf1468cc0a6c788acad152aeb0: Status 404 returned error can't find the container with id b76fc338b543506912f402476772a6ab47911bbf1468cc0a6c788acad152aeb0 Apr 22 18:48:50.099307 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:50.099290 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:48:50.273877 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:50.273787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" event={"ID":"8352304e-0fd7-4e77-b4d5-73c48a6ed88f","Type":"ContainerStarted","Data":"b76fc338b543506912f402476772a6ab47911bbf1468cc0a6c788acad152aeb0"} Apr 22 18:48:54.289802 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:54.289763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" event={"ID":"8352304e-0fd7-4e77-b4d5-73c48a6ed88f","Type":"ContainerStarted","Data":"935802fb89fd5cebcc10989a03154ced063285947f493acd8e5abd5709644716"} Apr 22 18:48:54.306170 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:48:54.306109 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-ftc8z" podStartSLOduration=2.111885653 
podStartE2EDuration="5.306094526s" podCreationTimestamp="2026-04-22 18:48:49 +0000 UTC" firstStartedPulling="2026-04-22 18:48:50.099417707 +0000 UTC m=+325.301138554" lastFinishedPulling="2026-04-22 18:48:53.293626579 +0000 UTC m=+328.495347427" observedRunningTime="2026-04-22 18:48:54.304480402 +0000 UTC m=+329.506201271" watchObservedRunningTime="2026-04-22 18:48:54.306094526 +0000 UTC m=+329.507815391" Apr 22 18:49:00.230189 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.230140 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd"] Apr 22 18:49:00.233251 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.233222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.236908 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.236884 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-7qq4m\"" Apr 22 18:49:00.237028 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.236884 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:49:00.237028 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.236920 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:49:00.243191 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.243152 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd"] Apr 22 18:49:00.333466 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.333128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nt22\" (UniqueName: 
\"kubernetes.io/projected/5dfd09fd-008d-4de9-baec-d1f6850fa4ab-kube-api-access-2nt22\") pod \"openshift-lws-operator-bfc7f696d-crgmd\" (UID: \"5dfd09fd-008d-4de9-baec-d1f6850fa4ab\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.333669 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.333513 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5dfd09fd-008d-4de9-baec-d1f6850fa4ab-tmp\") pod \"openshift-lws-operator-bfc7f696d-crgmd\" (UID: \"5dfd09fd-008d-4de9-baec-d1f6850fa4ab\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.434345 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.434306 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nt22\" (UniqueName: \"kubernetes.io/projected/5dfd09fd-008d-4de9-baec-d1f6850fa4ab-kube-api-access-2nt22\") pod \"openshift-lws-operator-bfc7f696d-crgmd\" (UID: \"5dfd09fd-008d-4de9-baec-d1f6850fa4ab\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.434345 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.434347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5dfd09fd-008d-4de9-baec-d1f6850fa4ab-tmp\") pod \"openshift-lws-operator-bfc7f696d-crgmd\" (UID: \"5dfd09fd-008d-4de9-baec-d1f6850fa4ab\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.434704 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.434689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5dfd09fd-008d-4de9-baec-d1f6850fa4ab-tmp\") pod \"openshift-lws-operator-bfc7f696d-crgmd\" (UID: \"5dfd09fd-008d-4de9-baec-d1f6850fa4ab\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.443154 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.443119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nt22\" (UniqueName: \"kubernetes.io/projected/5dfd09fd-008d-4de9-baec-d1f6850fa4ab-kube-api-access-2nt22\") pod \"openshift-lws-operator-bfc7f696d-crgmd\" (UID: \"5dfd09fd-008d-4de9-baec-d1f6850fa4ab\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.541673 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.541635 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" Apr 22 18:49:00.668345 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:00.668214 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd"] Apr 22 18:49:00.671140 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:49:00.671102 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dfd09fd_008d_4de9_baec_d1f6850fa4ab.slice/crio-c8b54195622b6fdcf48da5cd53c0b8522dfffed725926464764feb8a2d0a844c WatchSource:0}: Error finding container c8b54195622b6fdcf48da5cd53c0b8522dfffed725926464764feb8a2d0a844c: Status 404 returned error can't find the container with id c8b54195622b6fdcf48da5cd53c0b8522dfffed725926464764feb8a2d0a844c Apr 22 18:49:01.309420 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:01.309377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" event={"ID":"5dfd09fd-008d-4de9-baec-d1f6850fa4ab","Type":"ContainerStarted","Data":"c8b54195622b6fdcf48da5cd53c0b8522dfffed725926464764feb8a2d0a844c"} Apr 22 18:49:04.318467 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:04.318427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" 
event={"ID":"5dfd09fd-008d-4de9-baec-d1f6850fa4ab","Type":"ContainerStarted","Data":"c55f9743254125aa5b487dd1757e9a10513369e2f41d2117aed45243df35576e"} Apr 22 18:49:04.339440 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:04.339383 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-crgmd" podStartSLOduration=1.638478282 podStartE2EDuration="4.339364554s" podCreationTimestamp="2026-04-22 18:49:00 +0000 UTC" firstStartedPulling="2026-04-22 18:49:00.672507074 +0000 UTC m=+335.874227921" lastFinishedPulling="2026-04-22 18:49:03.373393345 +0000 UTC m=+338.575114193" observedRunningTime="2026-04-22 18:49:04.338727192 +0000 UTC m=+339.540448064" watchObservedRunningTime="2026-04-22 18:49:04.339364554 +0000 UTC m=+339.541085424" Apr 22 18:49:19.584741 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.584658 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb"] Apr 22 18:49:19.587623 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.587604 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.591300 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.591276 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 18:49:19.591421 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.591323 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 18:49:19.591421 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.591371 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 18:49:19.591532 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.591449 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 18:49:19.591649 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.591629 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-7nxpj\"" Apr 22 18:49:19.611476 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.611444 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb"] Apr 22 18:49:19.662886 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.662850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/183fa1a7-6751-4d87-ade8-b37ca1c36061-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.662886 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.662888 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6tzc\" (UniqueName: \"kubernetes.io/projected/183fa1a7-6751-4d87-ade8-b37ca1c36061-kube-api-access-q6tzc\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.663113 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.662941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/183fa1a7-6751-4d87-ade8-b37ca1c36061-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.763835 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.763728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/183fa1a7-6751-4d87-ade8-b37ca1c36061-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.764018 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.763866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6tzc\" (UniqueName: \"kubernetes.io/projected/183fa1a7-6751-4d87-ade8-b37ca1c36061-kube-api-access-q6tzc\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.764018 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.763900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/183fa1a7-6751-4d87-ade8-b37ca1c36061-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.766275 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.766251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/183fa1a7-6751-4d87-ade8-b37ca1c36061-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.766419 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.766398 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/183fa1a7-6751-4d87-ade8-b37ca1c36061-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.772360 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.772326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6tzc\" (UniqueName: \"kubernetes.io/projected/183fa1a7-6751-4d87-ade8-b37ca1c36061-kube-api-access-q6tzc\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb\" (UID: \"183fa1a7-6751-4d87-ade8-b37ca1c36061\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:19.897793 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:19.897706 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:20.023876 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:20.023824 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb"] Apr 22 18:49:20.027366 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:49:20.027336 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183fa1a7_6751_4d87_ade8_b37ca1c36061.slice/crio-083a5c59178c7b05058e129b317ac00c693812e6324aa4cea503f973b319be09 WatchSource:0}: Error finding container 083a5c59178c7b05058e129b317ac00c693812e6324aa4cea503f973b319be09: Status 404 returned error can't find the container with id 083a5c59178c7b05058e129b317ac00c693812e6324aa4cea503f973b319be09 Apr 22 18:49:20.356405 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:20.356364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" event={"ID":"183fa1a7-6751-4d87-ade8-b37ca1c36061","Type":"ContainerStarted","Data":"083a5c59178c7b05058e129b317ac00c693812e6324aa4cea503f973b319be09"} Apr 22 18:49:23.366678 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:23.366638 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" event={"ID":"183fa1a7-6751-4d87-ade8-b37ca1c36061","Type":"ContainerStarted","Data":"9e328ada1f6c9134d423b248f12dd945efe4373871ae9464044b02e24067a11b"} Apr 22 18:49:23.367073 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:23.366808 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:23.390434 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:23.390385 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" podStartSLOduration=1.845157331 podStartE2EDuration="4.39037094s" podCreationTimestamp="2026-04-22 18:49:19 +0000 UTC" firstStartedPulling="2026-04-22 18:49:20.028999425 +0000 UTC m=+355.230720272" lastFinishedPulling="2026-04-22 18:49:22.574213034 +0000 UTC m=+357.775933881" observedRunningTime="2026-04-22 18:49:23.388414293 +0000 UTC m=+358.590135168" watchObservedRunningTime="2026-04-22 18:49:23.39037094 +0000 UTC m=+358.592091809" Apr 22 18:49:34.371321 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:34.371291 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb" Apr 22 18:49:37.224035 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.223997 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql"] Apr 22 18:49:37.229278 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.229261 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.231706 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.231679 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 18:49:37.231877 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.231860 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 18:49:37.232549 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.232533 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:49:37.232613 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.232556 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:49:37.232613 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.232596 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-jt6pc\"" Apr 22 18:49:37.240396 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.240375 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql"] Apr 22 18:49:37.289870 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.289834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf3f30b-595c-49c8-9153-e20421137452-tls-certs\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.290056 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.289889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0cf3f30b-595c-49c8-9153-e20421137452-tmp\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.290056 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.289918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7l2d\" (UniqueName: \"kubernetes.io/projected/0cf3f30b-595c-49c8-9153-e20421137452-kube-api-access-b7l2d\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.390464 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.390425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0cf3f30b-595c-49c8-9153-e20421137452-tmp\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.390464 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.390466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7l2d\" (UniqueName: \"kubernetes.io/projected/0cf3f30b-595c-49c8-9153-e20421137452-kube-api-access-b7l2d\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.390717 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.390515 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf3f30b-595c-49c8-9153-e20421137452-tls-certs\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.392989 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.392957 
2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0cf3f30b-595c-49c8-9153-e20421137452-tmp\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.393104 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.393061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0cf3f30b-595c-49c8-9153-e20421137452-tls-certs\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.398837 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.398812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7l2d\" (UniqueName: \"kubernetes.io/projected/0cf3f30b-595c-49c8-9153-e20421137452-kube-api-access-b7l2d\") pod \"kube-auth-proxy-6bc9b7f4d-g27ql\" (UID: \"0cf3f30b-595c-49c8-9153-e20421137452\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.537894 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.537858 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" Apr 22 18:49:37.677400 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:37.677375 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql"] Apr 22 18:49:37.680070 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:49:37.680042 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf3f30b_595c_49c8_9153_e20421137452.slice/crio-ecae48cd4eae67da006decf0ec3668b7cbbc7c84e15b3edd32b9d8f930f6d117 WatchSource:0}: Error finding container ecae48cd4eae67da006decf0ec3668b7cbbc7c84e15b3edd32b9d8f930f6d117: Status 404 returned error can't find the container with id ecae48cd4eae67da006decf0ec3668b7cbbc7c84e15b3edd32b9d8f930f6d117 Apr 22 18:49:38.405933 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:38.405879 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" event={"ID":"0cf3f30b-595c-49c8-9153-e20421137452","Type":"ContainerStarted","Data":"ecae48cd4eae67da006decf0ec3668b7cbbc7c84e15b3edd32b9d8f930f6d117"} Apr 22 18:49:40.367249 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.367217 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-fhwzp"] Apr 22 18:49:40.369027 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.369012 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:40.372017 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.371994 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-v9tnm\"" Apr 22 18:49:40.372017 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.372009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:49:40.381259 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.381238 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-fhwzp"] Apr 22 18:49:40.414329 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.414295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x85d2\" (UniqueName: \"kubernetes.io/projected/91a1a790-afb3-4384-8461-1d43505b3147-kube-api-access-x85d2\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: \"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:40.414497 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.414345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: \"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:40.515469 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.515420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: \"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" 
Apr 22 18:49:40.515647 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.515510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x85d2\" (UniqueName: \"kubernetes.io/projected/91a1a790-afb3-4384-8461-1d43505b3147-kube-api-access-x85d2\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: \"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:40.515647 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:49:40.515586 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:49:40.515730 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:49:40.515673 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert podName:91a1a790-afb3-4384-8461-1d43505b3147 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:41.015654067 +0000 UTC m=+376.217374923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert") pod "odh-model-controller-858dbf95b8-fhwzp" (UID: "91a1a790-afb3-4384-8461-1d43505b3147") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:49:40.524928 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:40.524898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x85d2\" (UniqueName: \"kubernetes.io/projected/91a1a790-afb3-4384-8461-1d43505b3147-kube-api-access-x85d2\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: \"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:41.019692 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:41.019653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: \"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:41.019898 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:49:41.019823 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:49:41.019948 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:49:41.019916 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert podName:91a1a790-afb3-4384-8461-1d43505b3147 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:42.019892878 +0000 UTC m=+377.221613742 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert") pod "odh-model-controller-858dbf95b8-fhwzp" (UID: "91a1a790-afb3-4384-8461-1d43505b3147") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:49:41.415611 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:41.415580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" event={"ID":"0cf3f30b-595c-49c8-9153-e20421137452","Type":"ContainerStarted","Data":"3fafcaedd4c50c813ff3d9348eccbf4d701ad1ab9031f8854bb0193b9ad0ea90"} Apr 22 18:49:41.433716 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:41.433667 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-g27ql" podStartSLOduration=0.956939998 podStartE2EDuration="4.433654598s" podCreationTimestamp="2026-04-22 18:49:37 +0000 UTC" firstStartedPulling="2026-04-22 18:49:37.681834273 +0000 UTC m=+372.883555136" lastFinishedPulling="2026-04-22 18:49:41.158548872 +0000 UTC m=+376.360269736" observedRunningTime="2026-04-22 18:49:41.431780035 +0000 UTC m=+376.633500903" watchObservedRunningTime="2026-04-22 18:49:41.433654598 +0000 UTC m=+376.635375469" Apr 22 18:49:42.027581 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:42.027542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: \"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:42.030046 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:42.030025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91a1a790-afb3-4384-8461-1d43505b3147-cert\") pod \"odh-model-controller-858dbf95b8-fhwzp\" (UID: 
\"91a1a790-afb3-4384-8461-1d43505b3147\") " pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:42.179515 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:42.179464 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:42.326353 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:42.326280 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-fhwzp"] Apr 22 18:49:42.329415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:49:42.329391 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a1a790_afb3_4384_8461_1d43505b3147.slice/crio-a63060c7917e6ef8af68b5af483180ae8a2737a00c1f6f4388098bbd59ee61fa WatchSource:0}: Error finding container a63060c7917e6ef8af68b5af483180ae8a2737a00c1f6f4388098bbd59ee61fa: Status 404 returned error can't find the container with id a63060c7917e6ef8af68b5af483180ae8a2737a00c1f6f4388098bbd59ee61fa Apr 22 18:49:42.419381 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:42.419338 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" event={"ID":"91a1a790-afb3-4384-8461-1d43505b3147","Type":"ContainerStarted","Data":"a63060c7917e6ef8af68b5af483180ae8a2737a00c1f6f4388098bbd59ee61fa"} Apr 22 18:49:46.431427 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.431338 2578 generic.go:358] "Generic (PLEG): container finished" podID="91a1a790-afb3-4384-8461-1d43505b3147" containerID="faa2ec7a74fa0a6f8872a7209eaa9cd04cbf871fa6cd94753b78fa2c9607ce90" exitCode=1 Apr 22 18:49:46.431427 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.431382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" 
event={"ID":"91a1a790-afb3-4384-8461-1d43505b3147","Type":"ContainerDied","Data":"faa2ec7a74fa0a6f8872a7209eaa9cd04cbf871fa6cd94753b78fa2c9607ce90"} Apr 22 18:49:46.431816 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.431565 2578 scope.go:117] "RemoveContainer" containerID="faa2ec7a74fa0a6f8872a7209eaa9cd04cbf871fa6cd94753b78fa2c9607ce90" Apr 22 18:49:46.601403 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.601370 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-gpl7n"] Apr 22 18:49:46.603325 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.603310 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:46.609129 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.609099 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 22 18:49:46.611133 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.611111 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-smw9j\"" Apr 22 18:49:46.621542 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.621518 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-gpl7n"] Apr 22 18:49:46.667967 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.667941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/6a58b8a6-00ae-4197-8abe-ea940d8dc631-kube-api-access-l42vf\") pod \"kserve-controller-manager-856948b99f-gpl7n\" (UID: \"6a58b8a6-00ae-4197-8abe-ea940d8dc631\") " pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:46.668092 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.667987 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a58b8a6-00ae-4197-8abe-ea940d8dc631-cert\") pod \"kserve-controller-manager-856948b99f-gpl7n\" (UID: \"6a58b8a6-00ae-4197-8abe-ea940d8dc631\") " pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:46.768357 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.768332 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/6a58b8a6-00ae-4197-8abe-ea940d8dc631-kube-api-access-l42vf\") pod \"kserve-controller-manager-856948b99f-gpl7n\" (UID: \"6a58b8a6-00ae-4197-8abe-ea940d8dc631\") " pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:46.768515 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.768390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a58b8a6-00ae-4197-8abe-ea940d8dc631-cert\") pod \"kserve-controller-manager-856948b99f-gpl7n\" (UID: \"6a58b8a6-00ae-4197-8abe-ea940d8dc631\") " pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:46.768579 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:49:46.768527 2578 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 18:49:46.768639 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:49:46.768593 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a58b8a6-00ae-4197-8abe-ea940d8dc631-cert podName:6a58b8a6-00ae-4197-8abe-ea940d8dc631 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:47.26857152 +0000 UTC m=+382.470292380 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a58b8a6-00ae-4197-8abe-ea940d8dc631-cert") pod "kserve-controller-manager-856948b99f-gpl7n" (UID: "6a58b8a6-00ae-4197-8abe-ea940d8dc631") : secret "kserve-webhook-server-cert" not found Apr 22 18:49:46.784644 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:46.784609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l42vf\" (UniqueName: \"kubernetes.io/projected/6a58b8a6-00ae-4197-8abe-ea940d8dc631-kube-api-access-l42vf\") pod \"kserve-controller-manager-856948b99f-gpl7n\" (UID: \"6a58b8a6-00ae-4197-8abe-ea940d8dc631\") " pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:47.271991 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:47.271952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a58b8a6-00ae-4197-8abe-ea940d8dc631-cert\") pod \"kserve-controller-manager-856948b99f-gpl7n\" (UID: \"6a58b8a6-00ae-4197-8abe-ea940d8dc631\") " pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:47.274510 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:47.274483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a58b8a6-00ae-4197-8abe-ea940d8dc631-cert\") pod \"kserve-controller-manager-856948b99f-gpl7n\" (UID: \"6a58b8a6-00ae-4197-8abe-ea940d8dc631\") " pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:47.435863 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:47.435823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" event={"ID":"91a1a790-afb3-4384-8461-1d43505b3147","Type":"ContainerStarted","Data":"6f2112a1f180e4f15087f29f58264aae9a8f3800d0f60cbb5eb0830f501825ae"} Apr 22 18:49:47.436258 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:47.435937 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:49:47.457596 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:47.457554 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" podStartSLOduration=3.077300281 podStartE2EDuration="7.457542169s" podCreationTimestamp="2026-04-22 18:49:40 +0000 UTC" firstStartedPulling="2026-04-22 18:49:42.330752555 +0000 UTC m=+377.532473401" lastFinishedPulling="2026-04-22 18:49:46.710994442 +0000 UTC m=+381.912715289" observedRunningTime="2026-04-22 18:49:47.456037819 +0000 UTC m=+382.657758687" watchObservedRunningTime="2026-04-22 18:49:47.457542169 +0000 UTC m=+382.659263037" Apr 22 18:49:47.513594 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:47.513566 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:47.638866 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:47.638842 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-gpl7n"] Apr 22 18:49:47.641223 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:49:47.641194 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a58b8a6_00ae_4197_8abe_ea940d8dc631.slice/crio-3776dc36e57a2425818808226b7e982fce5778d0648e85db700b7c5f89d88cf3 WatchSource:0}: Error finding container 3776dc36e57a2425818808226b7e982fce5778d0648e85db700b7c5f89d88cf3: Status 404 returned error can't find the container with id 3776dc36e57a2425818808226b7e982fce5778d0648e85db700b7c5f89d88cf3 Apr 22 18:49:48.439953 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:48.439919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" 
event={"ID":"6a58b8a6-00ae-4197-8abe-ea940d8dc631","Type":"ContainerStarted","Data":"3776dc36e57a2425818808226b7e982fce5778d0648e85db700b7c5f89d88cf3"} Apr 22 18:49:51.450324 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:51.450286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" event={"ID":"6a58b8a6-00ae-4197-8abe-ea940d8dc631","Type":"ContainerStarted","Data":"caee619a952052a8dd8421e3e5cddaad318d94929b542173ef75297a597142cd"} Apr 22 18:49:51.450974 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:51.450421 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:49:51.519058 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:51.519006 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" podStartSLOduration=2.5894864440000003 podStartE2EDuration="5.518991742s" podCreationTimestamp="2026-04-22 18:49:46 +0000 UTC" firstStartedPulling="2026-04-22 18:49:47.642514522 +0000 UTC m=+382.844235368" lastFinishedPulling="2026-04-22 18:49:50.572019816 +0000 UTC m=+385.773740666" observedRunningTime="2026-04-22 18:49:51.51851112 +0000 UTC m=+386.720231991" watchObservedRunningTime="2026-04-22 18:49:51.518991742 +0000 UTC m=+386.720712610" Apr 22 18:49:52.968512 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:52.968482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv"] Apr 22 18:49:52.970416 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:52.970400 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:52.975134 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:52.975082 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 18:49:52.975283 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:52.975199 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 18:49:52.975283 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:52.975258 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-w7zmd\"" Apr 22 18:49:52.993944 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:52.993922 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv"] Apr 22 18:49:53.122484 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.122455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kttcm\" (UniqueName: \"kubernetes.io/projected/a393e6e5-0f56-4fe5-bdd2-4d2f355498cc-kube-api-access-kttcm\") pod \"servicemesh-operator3-55f49c5f94-nqtvv\" (UID: \"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:53.122651 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.122493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a393e6e5-0f56-4fe5-bdd2-4d2f355498cc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nqtvv\" (UID: \"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:53.223439 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.223369 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kttcm\" (UniqueName: \"kubernetes.io/projected/a393e6e5-0f56-4fe5-bdd2-4d2f355498cc-kube-api-access-kttcm\") pod \"servicemesh-operator3-55f49c5f94-nqtvv\" (UID: \"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:53.223439 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.223408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a393e6e5-0f56-4fe5-bdd2-4d2f355498cc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nqtvv\" (UID: \"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:53.225984 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.225954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a393e6e5-0f56-4fe5-bdd2-4d2f355498cc-operator-config\") pod \"servicemesh-operator3-55f49c5f94-nqtvv\" (UID: \"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:53.240137 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.240115 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kttcm\" (UniqueName: \"kubernetes.io/projected/a393e6e5-0f56-4fe5-bdd2-4d2f355498cc-kube-api-access-kttcm\") pod \"servicemesh-operator3-55f49c5f94-nqtvv\" (UID: \"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:53.279061 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.279035 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:53.414500 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.414450 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv"] Apr 22 18:49:53.417251 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:49:53.416243 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda393e6e5_0f56_4fe5_bdd2_4d2f355498cc.slice/crio-116aead476e2e1f7237268aa170a93c6bf6a4e5f86a4ccdb06b4b04664b4fc53 WatchSource:0}: Error finding container 116aead476e2e1f7237268aa170a93c6bf6a4e5f86a4ccdb06b4b04664b4fc53: Status 404 returned error can't find the container with id 116aead476e2e1f7237268aa170a93c6bf6a4e5f86a4ccdb06b4b04664b4fc53 Apr 22 18:49:53.457675 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:53.457651 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" event={"ID":"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc","Type":"ContainerStarted","Data":"116aead476e2e1f7237268aa170a93c6bf6a4e5f86a4ccdb06b4b04664b4fc53"} Apr 22 18:49:56.470124 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:56.470034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" event={"ID":"a393e6e5-0f56-4fe5-bdd2-4d2f355498cc","Type":"ContainerStarted","Data":"cd076738c5323157a920ad5356be77cc2f9aa9d2097d7f4660aa88d5b18588a8"} Apr 22 18:49:56.470566 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:56.470272 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:49:56.494389 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:56.494336 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" podStartSLOduration=1.831748425 podStartE2EDuration="4.494320492s" podCreationTimestamp="2026-04-22 18:49:52 +0000 UTC" firstStartedPulling="2026-04-22 18:49:53.419920654 +0000 UTC m=+388.621641502" lastFinishedPulling="2026-04-22 18:49:56.082492709 +0000 UTC m=+391.284213569" observedRunningTime="2026-04-22 18:49:56.493170202 +0000 UTC m=+391.694891071" watchObservedRunningTime="2026-04-22 18:49:56.494320492 +0000 UTC m=+391.696041361" Apr 22 18:49:58.442095 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:49:58.442066 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-fhwzp" Apr 22 18:50:07.475944 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:07.475914 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-nqtvv" Apr 22 18:50:22.459170 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:22.459131 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-gpl7n" Apr 22 18:50:37.815121 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.815087 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pb"] Apr 22 18:50:37.828674 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.828649 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" Apr 22 18:50:37.832721 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.832687 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:50:37.833352 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.833332 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-bltkv\"" Apr 22 18:50:37.834255 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.834235 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:50:37.837395 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.837371 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pb"] Apr 22 18:50:37.846480 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.846458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76lts\" (UniqueName: \"kubernetes.io/projected/740f2297-d882-42ed-87cd-c4d5fe1e52be-kube-api-access-76lts\") pod \"kuadrant-operator-catalog-t42pb\" (UID: \"740f2297-d882-42ed-87cd-c4d5fe1e52be\") " pod="kuadrant-system/kuadrant-operator-catalog-t42pb" Apr 22 18:50:37.947614 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.947575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76lts\" (UniqueName: \"kubernetes.io/projected/740f2297-d882-42ed-87cd-c4d5fe1e52be-kube-api-access-76lts\") pod \"kuadrant-operator-catalog-t42pb\" (UID: \"740f2297-d882-42ed-87cd-c4d5fe1e52be\") " pod="kuadrant-system/kuadrant-operator-catalog-t42pb" Apr 22 18:50:37.956571 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:37.956548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76lts\" (UniqueName: 
\"kubernetes.io/projected/740f2297-d882-42ed-87cd-c4d5fe1e52be-kube-api-access-76lts\") pod \"kuadrant-operator-catalog-t42pb\" (UID: \"740f2297-d882-42ed-87cd-c4d5fe1e52be\") " pod="kuadrant-system/kuadrant-operator-catalog-t42pb" Apr 22 18:50:38.117755 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.117674 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pb"] Apr 22 18:50:38.117912 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.117898 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" Apr 22 18:50:38.201851 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.200481 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x"] Apr 22 18:50:38.208268 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.207972 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.211605 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.211577 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 18:50:38.212006 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.211749 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-jv9xd\"" Apr 22 18:50:38.212006 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.211599 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 18:50:38.212006 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.211908 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 18:50:38.212006 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.211915 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 18:50:38.220161 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.220122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x"] Apr 22 18:50:38.249997 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.249966 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.249997 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.250000 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.250215 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.250090 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.250215 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.250137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.250288 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.250245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.250321 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.250301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/32280c0e-a9ce-42fb-85f9-29868b32364e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.250354 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.250336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkv8l\" (UniqueName: \"kubernetes.io/projected/32280c0e-a9ce-42fb-85f9-29868b32364e-kube-api-access-jkv8l\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.272744 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.272716 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pb"] Apr 22 18:50:38.275285 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:50:38.275256 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod740f2297_d882_42ed_87cd_c4d5fe1e52be.slice/crio-3902465b6a3a6a90776d19cf0c39450bbd0b0aa99e406d41ef86df69e4f83950 WatchSource:0}: Error finding container 3902465b6a3a6a90776d19cf0c39450bbd0b0aa99e406d41ef86df69e4f83950: Status 404 returned error can't find the container with id 3902465b6a3a6a90776d19cf0c39450bbd0b0aa99e406d41ef86df69e4f83950 Apr 22 18:50:38.353846 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.353800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/32280c0e-a9ce-42fb-85f9-29868b32364e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.353846 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.353842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkv8l\" (UniqueName: \"kubernetes.io/projected/32280c0e-a9ce-42fb-85f9-29868b32364e-kube-api-access-jkv8l\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.354091 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.353867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.354091 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.353884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-cacerts\") 
pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.354091 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.353910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.354091 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.353926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.354091 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.353974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.354933 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.354894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.356664 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.356633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.356843 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.356816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/32280c0e-a9ce-42fb-85f9-29868b32364e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.356843 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.356834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.357104 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.357086 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.362153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.362134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/32280c0e-a9ce-42fb-85f9-29868b32364e-istio-token\") pod 
\"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.362563 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.362545 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkv8l\" (UniqueName: \"kubernetes.io/projected/32280c0e-a9ce-42fb-85f9-29868b32364e-kube-api-access-jkv8l\") pod \"istiod-openshift-gateway-55ff986f96-5hk6x\" (UID: \"32280c0e-a9ce-42fb-85f9-29868b32364e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.521365 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.521273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:38.604605 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.604553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" event={"ID":"740f2297-d882-42ed-87cd-c4d5fe1e52be","Type":"ContainerStarted","Data":"3902465b6a3a6a90776d19cf0c39450bbd0b0aa99e406d41ef86df69e4f83950"} Apr 22 18:50:38.666783 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:38.666609 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x"] Apr 22 18:50:38.669224 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:50:38.669193 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32280c0e_a9ce_42fb_85f9_29868b32364e.slice/crio-ad7ad34279b22af9cb3695d8f441e6e2b59888028d9fff3ba561ad43889aca99 WatchSource:0}: Error finding container ad7ad34279b22af9cb3695d8f441e6e2b59888028d9fff3ba561ad43889aca99: Status 404 returned error can't find the container with id ad7ad34279b22af9cb3695d8f441e6e2b59888028d9fff3ba561ad43889aca99 Apr 22 18:50:39.610495 ip-10-0-135-106 kubenswrapper[2578]: 
I0422 18:50:39.610448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" event={"ID":"32280c0e-a9ce-42fb-85f9-29868b32364e","Type":"ContainerStarted","Data":"ad7ad34279b22af9cb3695d8f441e6e2b59888028d9fff3ba561ad43889aca99"} Apr 22 18:50:41.594934 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:41.594899 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 18:50:41.595301 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:41.594975 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 18:50:42.622820 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.622773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" event={"ID":"740f2297-d882-42ed-87cd-c4d5fe1e52be","Type":"ContainerStarted","Data":"c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463"} Apr 22 18:50:42.623330 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.622846 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" podUID="740f2297-d882-42ed-87cd-c4d5fe1e52be" containerName="registry-server" containerID="cri-o://c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463" gracePeriod=2 Apr 22 18:50:42.624617 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.624591 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" event={"ID":"32280c0e-a9ce-42fb-85f9-29868b32364e","Type":"ContainerStarted","Data":"cdb42a2223797083286f970647e2465895a9e2ee1527237742f80455913cb902"} Apr 22 18:50:42.624750 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.624712 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:42.626775 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.626750 2578 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-5hk6x container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 18:50:42.626871 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.626800 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" podUID="32280c0e-a9ce-42fb-85f9-29868b32364e" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:50:42.651708 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.651648 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" podStartSLOduration=2.337196728 podStartE2EDuration="5.651630952s" podCreationTimestamp="2026-04-22 18:50:37 +0000 UTC" firstStartedPulling="2026-04-22 18:50:38.276603696 +0000 UTC m=+433.478324544" lastFinishedPulling="2026-04-22 18:50:41.591037904 +0000 UTC m=+436.792758768" observedRunningTime="2026-04-22 18:50:42.650894545 +0000 UTC m=+437.852615428" watchObservedRunningTime="2026-04-22 18:50:42.651630952 +0000 UTC m=+437.853351824" Apr 22 18:50:42.672171 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.671589 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" podStartSLOduration=1.747951195 podStartE2EDuration="4.671568394s" podCreationTimestamp="2026-04-22 18:50:38 +0000 UTC" firstStartedPulling="2026-04-22 18:50:38.671116542 +0000 UTC m=+433.872837390" lastFinishedPulling="2026-04-22 18:50:41.594733742 +0000 UTC m=+436.796454589" observedRunningTime="2026-04-22 18:50:42.67063904 +0000 
UTC m=+437.872359908" watchObservedRunningTime="2026-04-22 18:50:42.671568394 +0000 UTC m=+437.873289267" Apr 22 18:50:42.856139 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.856117 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" Apr 22 18:50:42.898167 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.898081 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76lts\" (UniqueName: \"kubernetes.io/projected/740f2297-d882-42ed-87cd-c4d5fe1e52be-kube-api-access-76lts\") pod \"740f2297-d882-42ed-87cd-c4d5fe1e52be\" (UID: \"740f2297-d882-42ed-87cd-c4d5fe1e52be\") " Apr 22 18:50:42.900360 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.900331 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740f2297-d882-42ed-87cd-c4d5fe1e52be-kube-api-access-76lts" (OuterVolumeSpecName: "kube-api-access-76lts") pod "740f2297-d882-42ed-87cd-c4d5fe1e52be" (UID: "740f2297-d882-42ed-87cd-c4d5fe1e52be"). InnerVolumeSpecName "kube-api-access-76lts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:42.999012 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:42.998962 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-76lts\" (UniqueName: \"kubernetes.io/projected/740f2297-d882-42ed-87cd-c4d5fe1e52be-kube-api-access-76lts\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 22 18:50:43.629461 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.629379 2578 generic.go:358] "Generic (PLEG): container finished" podID="740f2297-d882-42ed-87cd-c4d5fe1e52be" containerID="c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463" exitCode=0 Apr 22 18:50:43.629862 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.629459 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" Apr 22 18:50:43.629862 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.629463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" event={"ID":"740f2297-d882-42ed-87cd-c4d5fe1e52be","Type":"ContainerDied","Data":"c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463"} Apr 22 18:50:43.629862 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.629499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t42pb" event={"ID":"740f2297-d882-42ed-87cd-c4d5fe1e52be","Type":"ContainerDied","Data":"3902465b6a3a6a90776d19cf0c39450bbd0b0aa99e406d41ef86df69e4f83950"} Apr 22 18:50:43.629862 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.629518 2578 scope.go:117] "RemoveContainer" containerID="c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463" Apr 22 18:50:43.630681 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.630456 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-5hk6x" Apr 22 18:50:43.638310 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.638292 2578 scope.go:117] "RemoveContainer" containerID="c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463" Apr 22 18:50:43.638577 ip-10-0-135-106 kubenswrapper[2578]: E0422 18:50:43.638561 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463\": container with ID starting with c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463 not found: ID does not exist" containerID="c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463" Apr 22 18:50:43.638625 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.638588 2578 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463"} err="failed to get container status \"c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463\": rpc error: code = NotFound desc = could not find container \"c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463\": container with ID starting with c09a7203b263e03f1cff077e28eb2873375d01c783e31e105f0e88575ae44463 not found: ID does not exist" Apr 22 18:50:43.644488 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.644465 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pb"] Apr 22 18:50:43.648833 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:43.648812 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t42pb"] Apr 22 18:50:45.333638 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:50:45.333600 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740f2297-d882-42ed-87cd-c4d5fe1e52be" path="/var/lib/kubelet/pods/740f2297-d882-42ed-87cd-c4d5fe1e52be/volumes" Apr 22 18:51:30.378643 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.378598 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-xnl9d"] Apr 22 18:51:30.379216 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.378975 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="740f2297-d882-42ed-87cd-c4d5fe1e52be" containerName="registry-server" Apr 22 18:51:30.379216 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.378992 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="740f2297-d882-42ed-87cd-c4d5fe1e52be" containerName="registry-server" Apr 22 18:51:30.379216 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.379101 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="740f2297-d882-42ed-87cd-c4d5fe1e52be" containerName="registry-server" Apr 
22 18:51:30.388644 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.388625 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" Apr 22 18:51:30.391407 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.391385 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:51:30.392265 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.392248 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:51:30.392352 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.392273 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-fzglc\"" Apr 22 18:51:30.395383 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.395360 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-xnl9d"] Apr 22 18:51:30.566651 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.566615 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5kl\" (UniqueName: \"kubernetes.io/projected/4601a6f4-2fcc-4ce9-a605-829d49dfe318-kube-api-access-kv5kl\") pod \"authorino-operator-657f44b778-xnl9d\" (UID: \"4601a6f4-2fcc-4ce9-a605-829d49dfe318\") " pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" Apr 22 18:51:30.667877 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.667789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5kl\" (UniqueName: \"kubernetes.io/projected/4601a6f4-2fcc-4ce9-a605-829d49dfe318-kube-api-access-kv5kl\") pod \"authorino-operator-657f44b778-xnl9d\" (UID: \"4601a6f4-2fcc-4ce9-a605-829d49dfe318\") " pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" Apr 22 18:51:30.697377 ip-10-0-135-106 
kubenswrapper[2578]: I0422 18:51:30.697342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5kl\" (UniqueName: \"kubernetes.io/projected/4601a6f4-2fcc-4ce9-a605-829d49dfe318-kube-api-access-kv5kl\") pod \"authorino-operator-657f44b778-xnl9d\" (UID: \"4601a6f4-2fcc-4ce9-a605-829d49dfe318\") " pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" Apr 22 18:51:30.699081 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.699060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" Apr 22 18:51:30.836837 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:30.836810 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-xnl9d"] Apr 22 18:51:30.839486 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:51:30.839459 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4601a6f4_2fcc_4ce9_a605_829d49dfe318.slice/crio-740031aa7d32541056e7a6af77362e253eb046d780d40dde2eeb087975846789 WatchSource:0}: Error finding container 740031aa7d32541056e7a6af77362e253eb046d780d40dde2eeb087975846789: Status 404 returned error can't find the container with id 740031aa7d32541056e7a6af77362e253eb046d780d40dde2eeb087975846789 Apr 22 18:51:31.780669 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:31.780633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" event={"ID":"4601a6f4-2fcc-4ce9-a605-829d49dfe318","Type":"ContainerStarted","Data":"740031aa7d32541056e7a6af77362e253eb046d780d40dde2eeb087975846789"} Apr 22 18:51:32.785786 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:32.785736 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" 
event={"ID":"4601a6f4-2fcc-4ce9-a605-829d49dfe318","Type":"ContainerStarted","Data":"62ea584266a8fdc41fe441a663d5d5e833a588add3f9c760ff5a945c9774f823"} Apr 22 18:51:32.786211 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:32.785925 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" Apr 22 18:51:32.813915 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:32.813854 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" podStartSLOduration=1.304017832 podStartE2EDuration="2.813838409s" podCreationTimestamp="2026-04-22 18:51:30 +0000 UTC" firstStartedPulling="2026-04-22 18:51:30.841474644 +0000 UTC m=+486.043195491" lastFinishedPulling="2026-04-22 18:51:32.351295217 +0000 UTC m=+487.553016068" observedRunningTime="2026-04-22 18:51:32.812648047 +0000 UTC m=+488.014368917" watchObservedRunningTime="2026-04-22 18:51:32.813838409 +0000 UTC m=+488.015559281" Apr 22 18:51:43.790732 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:51:43.790691 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-xnl9d" Apr 22 18:52:43.799376 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.799292 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7"] Apr 22 18:52:43.802537 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.802498 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.804935 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.804913 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-j6nk7\"" Apr 22 18:52:43.821381 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.821350 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7"] Apr 22 18:52:43.908084 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908084 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908356 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908107 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908356 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908356 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908215 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908356 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908356 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndp5\" (UniqueName: \"kubernetes.io/projected/7a7aa15c-54fd-4ea3-a092-c9a349742470-kube-api-access-cndp5\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908356 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:43.908653 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:43.908375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7a7aa15c-54fd-4ea3-a092-c9a349742470-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.009626 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7a7aa15c-54fd-4ea3-a092-c9a349742470-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.009801 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 
18:52:44.009801 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009672 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.009906 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.009906 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.009906 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010111 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009939 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010111 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.009992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cndp5\" (UniqueName: \"kubernetes.io/projected/7a7aa15c-54fd-4ea3-a092-c9a349742470-kube-api-access-cndp5\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010111 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.010025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010111 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.010099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010832 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.010346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-workload-socket\") pod 
\"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010832 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.010414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7a7aa15c-54fd-4ea3-a092-c9a349742470-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010832 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.010464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.010832 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.010655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.012238 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.012215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.012553 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.012516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.021140 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.021106 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndp5\" (UniqueName: \"kubernetes.io/projected/7a7aa15c-54fd-4ea3-a092-c9a349742470-kube-api-access-cndp5\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.026815 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.026784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7a7aa15c-54fd-4ea3-a092-c9a349742470-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-clfc7\" (UID: \"7a7aa15c-54fd-4ea3-a092-c9a349742470\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.114336 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.114237 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:44.244789 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:44.244741 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7"] Apr 22 18:52:44.247050 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:52:44.247020 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7aa15c_54fd_4ea3_a092_c9a349742470.slice/crio-3ecc591f83878ec1383127ad4444e0f730a6e7ff02070a3cbebb34839339dbc1 WatchSource:0}: Error finding container 3ecc591f83878ec1383127ad4444e0f730a6e7ff02070a3cbebb34839339dbc1: Status 404 returned error can't find the container with id 3ecc591f83878ec1383127ad4444e0f730a6e7ff02070a3cbebb34839339dbc1 Apr 22 18:52:45.028591 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:45.028555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" event={"ID":"7a7aa15c-54fd-4ea3-a092-c9a349742470","Type":"ContainerStarted","Data":"3ecc591f83878ec1383127ad4444e0f730a6e7ff02070a3cbebb34839339dbc1"} Apr 22 18:52:47.622285 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:47.622241 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 18:52:47.622563 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:47.622325 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 18:52:47.622563 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:47.622350 2578 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 22 18:52:48.039972 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.039934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" event={"ID":"7a7aa15c-54fd-4ea3-a092-c9a349742470","Type":"ContainerStarted","Data":"dd8a4b10f2ecbbfdce6be2be85ad4eb851780c64a176a1fdac77b9ba153ac73e"} Apr 22 18:52:48.059666 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.059611 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" podStartSLOduration=1.6866159889999999 podStartE2EDuration="5.05959397s" podCreationTimestamp="2026-04-22 18:52:43 +0000 UTC" firstStartedPulling="2026-04-22 18:52:44.248948153 +0000 UTC m=+559.450669013" lastFinishedPulling="2026-04-22 18:52:47.621926136 +0000 UTC m=+562.823646994" observedRunningTime="2026-04-22 18:52:48.058133203 +0000 UTC m=+563.259854096" watchObservedRunningTime="2026-04-22 18:52:48.05959397 +0000 UTC m=+563.261314840" Apr 22 18:52:48.114678 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.114647 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:48.116077 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.116052 2578 patch_prober.go:28] interesting pod/maas-default-gateway-openshift-default-58b6f876-clfc7 container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.132.0.23:15021/healthz/ready\": dial tcp 10.132.0.23:15021: connect: connection refused" start-of-body= Apr 22 18:52:48.116156 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.116105 2578 prober.go:120] "Probe failed" probeType="Startup" 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" podUID="7a7aa15c-54fd-4ea3-a092-c9a349742470" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.23:15021/healthz/ready\": dial tcp 10.132.0.23:15021: connect: connection refused" Apr 22 18:52:48.800206 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.800162 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:52:48.803321 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.803302 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:48.805494 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.805472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 18:52:48.805597 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.805492 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-wnkm8\"" Apr 22 18:52:48.809635 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.809616 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:52:48.832345 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.829611 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:52:48.948279 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.948251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a9e957f0-c051-42c8-8013-54cdd9ca75f7-config-file\") pod \"limitador-limitador-78c99df468-lnnvs\" (UID: \"a9e957f0-c051-42c8-8013-54cdd9ca75f7\") " pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:48.948492 
ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:48.948299 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrjjp\" (UniqueName: \"kubernetes.io/projected/a9e957f0-c051-42c8-8013-54cdd9ca75f7-kube-api-access-jrjjp\") pod \"limitador-limitador-78c99df468-lnnvs\" (UID: \"a9e957f0-c051-42c8-8013-54cdd9ca75f7\") " pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:49.048989 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:49.048955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrjjp\" (UniqueName: \"kubernetes.io/projected/a9e957f0-c051-42c8-8013-54cdd9ca75f7-kube-api-access-jrjjp\") pod \"limitador-limitador-78c99df468-lnnvs\" (UID: \"a9e957f0-c051-42c8-8013-54cdd9ca75f7\") " pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:49.049153 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:49.049020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a9e957f0-c051-42c8-8013-54cdd9ca75f7-config-file\") pod \"limitador-limitador-78c99df468-lnnvs\" (UID: \"a9e957f0-c051-42c8-8013-54cdd9ca75f7\") " pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:49.049571 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:49.049554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a9e957f0-c051-42c8-8013-54cdd9ca75f7-config-file\") pod \"limitador-limitador-78c99df468-lnnvs\" (UID: \"a9e957f0-c051-42c8-8013-54cdd9ca75f7\") " pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:49.063405 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:49.063334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrjjp\" (UniqueName: 
\"kubernetes.io/projected/a9e957f0-c051-42c8-8013-54cdd9ca75f7-kube-api-access-jrjjp\") pod \"limitador-limitador-78c99df468-lnnvs\" (UID: \"a9e957f0-c051-42c8-8013-54cdd9ca75f7\") " pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:49.114392 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:49.114358 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:49.118249 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:49.118224 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:49.234266 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:49.234233 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:52:49.236554 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:52:49.236524 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e957f0_c051_42c8_8013_54cdd9ca75f7.slice/crio-45f84b6feb84eee36a6170739bb63a4fd6ced30e007c990059cf126be18d4024 WatchSource:0}: Error finding container 45f84b6feb84eee36a6170739bb63a4fd6ced30e007c990059cf126be18d4024: Status 404 returned error can't find the container with id 45f84b6feb84eee36a6170739bb63a4fd6ced30e007c990059cf126be18d4024 Apr 22 18:52:50.049388 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:50.049306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" event={"ID":"a9e957f0-c051-42c8-8013-54cdd9ca75f7","Type":"ContainerStarted","Data":"45f84b6feb84eee36a6170739bb63a4fd6ced30e007c990059cf126be18d4024"} Apr 22 18:52:50.049829 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:50.049686 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:50.050862 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:50.050832 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-clfc7" Apr 22 18:52:52.057671 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:52.057631 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" event={"ID":"a9e957f0-c051-42c8-8013-54cdd9ca75f7","Type":"ContainerStarted","Data":"a9ceca6c861d4ab4db032afaf069604b702713a50b3e2daf3377397780cc2762"} Apr 22 18:52:52.058108 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:52.057883 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:52:52.075400 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:52:52.075359 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" podStartSLOduration=1.62779273 podStartE2EDuration="4.075347421s" podCreationTimestamp="2026-04-22 18:52:48 +0000 UTC" firstStartedPulling="2026-04-22 18:52:49.238304116 +0000 UTC m=+564.440024964" lastFinishedPulling="2026-04-22 18:52:51.685858806 +0000 UTC m=+566.887579655" observedRunningTime="2026-04-22 18:52:52.074827687 +0000 UTC m=+567.276548557" watchObservedRunningTime="2026-04-22 18:52:52.075347421 +0000 UTC m=+567.277068310" Apr 22 18:53:03.062239 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:53:03.062171 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-lnnvs" Apr 22 18:53:25.229123 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:53:25.229095 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log" 
Apr 22 18:53:25.230017 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:53:25.229999 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log" Apr 22 18:53:27.276473 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:53:27.276433 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:54:06.260463 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:54:06.260425 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:54:10.199895 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:54:10.199857 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:54:41.382116 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:54:41.382082 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:54:48.289188 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:54:48.289132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:54:56.287523 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:54:56.287486 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:55:21.582122 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:55:21.582039 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lnnvs"] Apr 22 18:57:04.249404 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:04.249361 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-gpl7n_6a58b8a6-00ae-4197-8abe-ea940d8dc631/manager/0.log" Apr 22 18:57:04.579690 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:04.579654 2578 log.go:25] "Finished parsing 
log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-fhwzp_91a1a790-afb3-4384-8461-1d43505b3147/manager/1.log" Apr 22 18:57:04.933540 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:04.933440 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb_183fa1a7-6751-4d87-ade8-b37ca1c36061/manager/0.log" Apr 22 18:57:06.383296 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:06.383264 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-xnl9d_4601a6f4-2fcc-4ce9-a605-829d49dfe318/manager/0.log" Apr 22 18:57:06.946040 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:06.945983 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-lnnvs_a9e957f0-c051-42c8-8013-54cdd9ca75f7/limitador/0.log" Apr 22 18:57:07.503538 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:07.503504 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-5hk6x_32280c0e-a9ce-42fb-85f9-29868b32364e/discovery/0.log" Apr 22 18:57:07.722793 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:07.722763 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bc9b7f4d-g27ql_0cf3f30b-595c-49c8-9153-e20421137452/kube-auth-proxy/0.log" Apr 22 18:57:07.837673 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:07.837638 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-clfc7_7a7aa15c-54fd-4ea3-a092-c9a349742470/istio-proxy/0.log" Apr 22 18:57:12.955615 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:12.955585 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nb747/must-gather-thz6h"] Apr 22 18:57:12.958976 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:12.958960 2578 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-must-gather-nb747/must-gather-thz6h" Apr 22 18:57:12.961320 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:12.961297 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nb747\"/\"kube-root-ca.crt\"" Apr 22 18:57:12.961320 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:12.961304 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nb747\"/\"openshift-service-ca.crt\"" Apr 22 18:57:12.962057 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:12.962040 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nb747\"/\"default-dockercfg-hb8wb\"" Apr 22 18:57:12.966067 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:12.966048 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/must-gather-thz6h"] Apr 22 18:57:13.083990 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.083960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8979c31-e54b-4cdc-b500-f0455751296d-must-gather-output\") pod \"must-gather-thz6h\" (UID: \"b8979c31-e54b-4cdc-b500-f0455751296d\") " pod="openshift-must-gather-nb747/must-gather-thz6h" Apr 22 18:57:13.084171 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.083999 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlvd8\" (UniqueName: \"kubernetes.io/projected/b8979c31-e54b-4cdc-b500-f0455751296d-kube-api-access-hlvd8\") pod \"must-gather-thz6h\" (UID: \"b8979c31-e54b-4cdc-b500-f0455751296d\") " pod="openshift-must-gather-nb747/must-gather-thz6h" Apr 22 18:57:13.185439 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.185404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/b8979c31-e54b-4cdc-b500-f0455751296d-must-gather-output\") pod \"must-gather-thz6h\" (UID: \"b8979c31-e54b-4cdc-b500-f0455751296d\") " pod="openshift-must-gather-nb747/must-gather-thz6h"
Apr 22 18:57:13.185439 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.185445 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlvd8\" (UniqueName: \"kubernetes.io/projected/b8979c31-e54b-4cdc-b500-f0455751296d-kube-api-access-hlvd8\") pod \"must-gather-thz6h\" (UID: \"b8979c31-e54b-4cdc-b500-f0455751296d\") " pod="openshift-must-gather-nb747/must-gather-thz6h"
Apr 22 18:57:13.185755 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.185736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8979c31-e54b-4cdc-b500-f0455751296d-must-gather-output\") pod \"must-gather-thz6h\" (UID: \"b8979c31-e54b-4cdc-b500-f0455751296d\") " pod="openshift-must-gather-nb747/must-gather-thz6h"
Apr 22 18:57:13.194655 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.194628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlvd8\" (UniqueName: \"kubernetes.io/projected/b8979c31-e54b-4cdc-b500-f0455751296d-kube-api-access-hlvd8\") pod \"must-gather-thz6h\" (UID: \"b8979c31-e54b-4cdc-b500-f0455751296d\") " pod="openshift-must-gather-nb747/must-gather-thz6h"
Apr 22 18:57:13.268858 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.268794 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nb747/must-gather-thz6h"
Apr 22 18:57:13.391404 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.391357 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/must-gather-thz6h"]
Apr 22 18:57:13.393535 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:57:13.393507 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8979c31_e54b_4cdc_b500_f0455751296d.slice/crio-487bb88ad3046f5c30b77a84c43efb18cb955213defed14227887325521b7811 WatchSource:0}: Error finding container 487bb88ad3046f5c30b77a84c43efb18cb955213defed14227887325521b7811: Status 404 returned error can't find the container with id 487bb88ad3046f5c30b77a84c43efb18cb955213defed14227887325521b7811
Apr 22 18:57:13.395308 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.395292 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:57:13.927577 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:13.927544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/must-gather-thz6h" event={"ID":"b8979c31-e54b-4cdc-b500-f0455751296d","Type":"ContainerStarted","Data":"487bb88ad3046f5c30b77a84c43efb18cb955213defed14227887325521b7811"}
Apr 22 18:57:14.934329 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:14.934290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/must-gather-thz6h" event={"ID":"b8979c31-e54b-4cdc-b500-f0455751296d","Type":"ContainerStarted","Data":"d111512a1ff820436b537cffa33222f21de409fba7b175e4de3ef75474e1b2e0"}
Apr 22 18:57:14.934780 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:14.934336 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/must-gather-thz6h" event={"ID":"b8979c31-e54b-4cdc-b500-f0455751296d","Type":"ContainerStarted","Data":"2e6689dbdc57f6de908bd25c1ff5ae6f556268e5f012c08491e5a40345317be8"}
Apr 22 18:57:14.958149 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:14.958083 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nb747/must-gather-thz6h" podStartSLOduration=2.15390839 podStartE2EDuration="2.958063275s" podCreationTimestamp="2026-04-22 18:57:12 +0000 UTC" firstStartedPulling="2026-04-22 18:57:13.395446825 +0000 UTC m=+828.597167673" lastFinishedPulling="2026-04-22 18:57:14.199601711 +0000 UTC m=+829.401322558" observedRunningTime="2026-04-22 18:57:14.956784489 +0000 UTC m=+830.158505357" watchObservedRunningTime="2026-04-22 18:57:14.958063275 +0000 UTC m=+830.159784145"
Apr 22 18:57:15.848776 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:15.848733 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-q9qsg_827aef15-283e-49d7-8df1-ebaee65d73aa/global-pull-secret-syncer/0.log"
Apr 22 18:57:15.961811 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:15.961777 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rt5wm_51bbe31f-c966-4131-8425-d7f7a16f402e/konnectivity-agent/0.log"
Apr 22 18:57:16.052133 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:16.052101 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-106.ec2.internal_477d8d5a68adc15182b0ab0c3cde7f73/haproxy/0.log"
Apr 22 18:57:20.279844 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:20.279818 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-xnl9d_4601a6f4-2fcc-4ce9-a605-829d49dfe318/manager/0.log"
Apr 22 18:57:20.434748 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:20.434712 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-lnnvs_a9e957f0-c051-42c8-8013-54cdd9ca75f7/limitador/0.log"
Apr 22 18:57:22.304695 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:22.304646 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sb4w9_51a228a5-6cb1-4fe6-a0bd-b497349eee85/node-exporter/0.log"
Apr 22 18:57:22.330937 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:22.330898 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sb4w9_51a228a5-6cb1-4fe6-a0bd-b497349eee85/kube-rbac-proxy/0.log"
Apr 22 18:57:22.352816 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:22.352775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sb4w9_51a228a5-6cb1-4fe6-a0bd-b497349eee85/init-textfile/0.log"
Apr 22 18:57:24.564779 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.564723 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"]
Apr 22 18:57:24.572400 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.572373 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.585934 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.585871 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"]
Apr 22 18:57:24.593609 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.593572 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-lib-modules\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.593769 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.593626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-podres\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.593769 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.593654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ptc\" (UniqueName: \"kubernetes.io/projected/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-kube-api-access-g4ptc\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.593769 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.593689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-proc\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.593769 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.593735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-sys\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.694793 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-lib-modules\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.694963 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-podres\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.694963 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ptc\" (UniqueName: \"kubernetes.io/projected/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-kube-api-access-g4ptc\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.694963 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-proc\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.694963 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-sys\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.694963 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-lib-modules\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.694963 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-proc\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.695300 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.694984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-sys\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.695368 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.695344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-podres\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.707257 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.707229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ptc\" (UniqueName: \"kubernetes.io/projected/ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd-kube-api-access-g4ptc\") pod \"perf-node-gather-daemonset-586fr\" (UID: \"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:24.889795 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:24.889325 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:25.043687 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:25.043655 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"]
Apr 22 18:57:25.047415 ip-10-0-135-106 kubenswrapper[2578]: W0422 18:57:25.047361 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podef12b5c7_09bb_4a5a_b0ce_2d8d0c2054bd.slice/crio-58c48eff2ecf59cd9203ce7e6b893881c1abb992a084d1c4b27097c7ae6f071d WatchSource:0}: Error finding container 58c48eff2ecf59cd9203ce7e6b893881c1abb992a084d1c4b27097c7ae6f071d: Status 404 returned error can't find the container with id 58c48eff2ecf59cd9203ce7e6b893881c1abb992a084d1c4b27097c7ae6f071d
Apr 22 18:57:25.989021 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:25.988983 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr" event={"ID":"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd","Type":"ContainerStarted","Data":"92061a5709cd47e5c1af3e70a5963210414e270ad0ef26b40e9569f3da9652e2"}
Apr 22 18:57:25.989478 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:25.989030 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr" event={"ID":"ef12b5c7-09bb-4a5a-b0ce-2d8d0c2054bd","Type":"ContainerStarted","Data":"58c48eff2ecf59cd9203ce7e6b893881c1abb992a084d1c4b27097c7ae6f071d"}
Apr 22 18:57:25.989478 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:25.989115 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:26.005430 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:26.005372 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr" podStartSLOduration=2.005353778 podStartE2EDuration="2.005353778s" podCreationTimestamp="2026-04-22 18:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:26.004652423 +0000 UTC m=+841.206373303" watchObservedRunningTime="2026-04-22 18:57:26.005353778 +0000 UTC m=+841.207074648"
Apr 22 18:57:26.622358 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:26.622327 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vkwg8_c89a564f-52a0-4145-8cad-2ae9ecec0329/dns/0.log"
Apr 22 18:57:26.642439 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:26.642412 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vkwg8_c89a564f-52a0-4145-8cad-2ae9ecec0329/kube-rbac-proxy/0.log"
Apr 22 18:57:26.714138 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:26.714110 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b9bb8_09d3f614-cd83-4e5b-8bb6-06b778d0eda3/dns-node-resolver/0.log"
Apr 22 18:57:27.205199 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:27.205152 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6268s_a42ca4fd-ca90-4584-971c-d1d61ff097f6/node-ca/0.log"
Apr 22 18:57:28.127287 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:28.127254 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-5hk6x_32280c0e-a9ce-42fb-85f9-29868b32364e/discovery/0.log"
Apr 22 18:57:28.170161 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:28.170131 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bc9b7f4d-g27ql_0cf3f30b-595c-49c8-9153-e20421137452/kube-auth-proxy/0.log"
Apr 22 18:57:28.198488 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:28.198460 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-clfc7_7a7aa15c-54fd-4ea3-a092-c9a349742470/istio-proxy/0.log"
Apr 22 18:57:28.727052 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:28.727014 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-btplr_f27a9a75-c190-4880-b981-29378f194918/serve-healthcheck-canary/0.log"
Apr 22 18:57:29.223669 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:29.223642 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87l95_19bf46ad-21ba-4f5c-94db-81d68fd09368/kube-rbac-proxy/0.log"
Apr 22 18:57:29.245123 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:29.245100 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87l95_19bf46ad-21ba-4f5c-94db-81d68fd09368/exporter/0.log"
Apr 22 18:57:29.265234 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:29.265203 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87l95_19bf46ad-21ba-4f5c-94db-81d68fd09368/extractor/0.log"
Apr 22 18:57:31.240107 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:31.240066 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-gpl7n_6a58b8a6-00ae-4197-8abe-ea940d8dc631/manager/0.log"
Apr 22 18:57:31.303763 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:31.303735 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-fhwzp_91a1a790-afb3-4384-8461-1d43505b3147/manager/0.log"
Apr 22 18:57:31.313862 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:31.313835 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-fhwzp_91a1a790-afb3-4384-8461-1d43505b3147/manager/1.log"
Apr 22 18:57:31.411637 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:31.411606 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7c59bb5d7b-jxlhb_183fa1a7-6751-4d87-ade8-b37ca1c36061/manager/0.log"
Apr 22 18:57:32.003648 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:32.003612 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-586fr"
Apr 22 18:57:32.541572 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:32.541540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-crgmd_5dfd09fd-008d-4de9-baec-d1f6850fa4ab/openshift-lws-operator/0.log"
Apr 22 18:57:38.803287 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.803256 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jvwbm_09c19634-8110-452e-9a84-963e44013755/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:57:38.823735 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.823701 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jvwbm_09c19634-8110-452e-9a84-963e44013755/egress-router-binary-copy/0.log"
Apr 22 18:57:38.844065 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.844038 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jvwbm_09c19634-8110-452e-9a84-963e44013755/cni-plugins/0.log"
Apr 22 18:57:38.867903 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.867877 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jvwbm_09c19634-8110-452e-9a84-963e44013755/bond-cni-plugin/0.log"
Apr 22 18:57:38.886997 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.886972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jvwbm_09c19634-8110-452e-9a84-963e44013755/routeoverride-cni/0.log"
Apr 22 18:57:38.906709 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.906662 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jvwbm_09c19634-8110-452e-9a84-963e44013755/whereabouts-cni-bincopy/0.log"
Apr 22 18:57:38.926640 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.926615 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jvwbm_09c19634-8110-452e-9a84-963e44013755/whereabouts-cni/0.log"
Apr 22 18:57:38.984200 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:38.984129 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sdhdb_f559fab9-b5a3-456c-8531-308e3635428e/kube-multus/0.log"
Apr 22 18:57:39.043199 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:39.043158 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gzjvx_111ee8c4-f2a7-4e7b-8faf-15392cc75774/network-metrics-daemon/0.log"
Apr 22 18:57:39.064463 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:39.064404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gzjvx_111ee8c4-f2a7-4e7b-8faf-15392cc75774/kube-rbac-proxy/0.log"
Apr 22 18:57:39.938516 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:39.938487 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-controller/0.log"
Apr 22 18:57:39.957460 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:39.957434 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/0.log"
Apr 22 18:57:39.961982 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:39.961960 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovn-acl-logging/1.log"
Apr 22 18:57:39.982257 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:39.982236 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/kube-rbac-proxy-node/0.log"
Apr 22 18:57:40.009594 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:40.009560 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:57:40.026490 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:40.026461 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/northd/0.log"
Apr 22 18:57:40.048763 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:40.048730 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/nbdb/0.log"
Apr 22 18:57:40.070783 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:40.070747 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/sbdb/0.log"
Apr 22 18:57:40.173192 ip-10-0-135-106 kubenswrapper[2578]: I0422 18:57:40.173151 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mm8rp_7cca2fcb-981e-45db-b2b9-8fc7b0d093b4/ovnkube-controller/0.log"