Apr 17 20:01:19.250135 ip-10-0-134-158 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:01:19.250147 ip-10-0-134-158 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:01:19.250154 ip-10-0-134-158 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:01:19.250379 ip-10-0-134-158 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:01:30.565432 ip-10-0-134-158 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:01:30.565453 ip-10-0-134-158 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 00cf814d25d04b53881e6fcdfb31aa2a --
Apr 17 20:03:37.042243 ip-10-0-134-158 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:03:37.575825 ip-10-0-134-158 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:03:37.575825 ip-10-0-134-158 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:03:37.575825 ip-10-0-134-158 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:03:37.575825 ip-10-0-134-158 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:03:37.575825 ip-10-0-134-158 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:03:37.576853 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.576747 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:03:37.582170 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582150 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:03:37.582170 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582169 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582174 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582177 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582181 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582184 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582186 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582189 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582192 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582195 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582197 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582200 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582203 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582205 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582208 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582211 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582215 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582217 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582220 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582223 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582225 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:03:37.582245 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582228 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582231 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582234 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582236 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582239 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582242 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582244 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582247 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582251 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582254 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582257 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582261 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582263 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582266 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582269 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582272 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582274 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582277 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582280 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:03:37.582724 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582283 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582285 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582288 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582292 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582296 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582298 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582301 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582303 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582306 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582309 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582311 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582313 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582316 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582318 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582321 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582325 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582327 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582330 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582333 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:03:37.583209 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582335 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582338 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582340 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582343 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582346 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582348 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582351 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582355 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582358 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582361 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582363 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582366 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582369 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582372 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582374 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582377 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582379 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582382 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582384 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582387 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:03:37.583700 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582390 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582392 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582395 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582398 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582400 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582404 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582408 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582840 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582846 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582850 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582853 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582856 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582860 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582863 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582866 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582869 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582871 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582874 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582876 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:03:37.584192 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582879 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582882 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582884 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582887 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582890 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582893 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582895 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582898 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582900 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582903 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582905 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582908 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582911 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582914 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582916 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582919 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582922 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582925 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582927 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582930 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:03:37.584647 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582932 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582935 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582938 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582940 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582943 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582946 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582948 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582950 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582953 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582955 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582958 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582960 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582963 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582965 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582967 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582970 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582972 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582976 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582978 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582981 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:03:37.585174 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582983 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582986 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582988 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582991 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582994 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.582997 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583000 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583002 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583005 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583007 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583010 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583013 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583015 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583018 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583020 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583025 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583029 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583032 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:03:37.585663 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583034 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583037 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583040 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583042 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583045 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583048 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583050 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583053 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583055 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583058 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583061 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583063 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583070 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583073 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583075 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.583078 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584517 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584528 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584535 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584541 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584545 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:03:37.586138 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584549 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584553 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584558 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584561 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584564 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584568 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584571 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584574 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584577 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584580 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584583 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584586 2567 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584589 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.584592 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585172 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585176 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585180 2567 flags.go:64] FLAG: --config-dir=""
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585183 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585187 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585191 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585195 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585198 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585202 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585205 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:03:37.586652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585209 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585213 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585216 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585219 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585224 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585228 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585230 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585233 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585236 2567 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585240 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585244 2567 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585247 2567 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585250 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585253 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585256 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585260 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585263 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585266 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585269 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585272 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585275 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585278 2567 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585281 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585283 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585286 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 20:03:37.587252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585289 2567 flags.go:64] FLAG: --feature-gates="" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585293 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585296 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585299 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585302 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585306 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585309 2567 flags.go:64] FLAG: --help="false" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585312 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-134-158.ec2.internal" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585315 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585318 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585321 2567 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585325 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585329 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585331 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585334 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585337 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585340 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585343 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585347 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585350 2567 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585353 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585356 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585359 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:03:37.587861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585362 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:03:37.587861 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:03:37.585365 2567 flags.go:64] FLAG: --lock-file="" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585367 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585370 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585373 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585379 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585382 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585385 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585388 2567 flags.go:64] FLAG: --logging-format="text" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585391 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585394 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585397 2567 flags.go:64] FLAG: --manifest-url="" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585400 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585405 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585408 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585413 2567 flags.go:64] FLAG: --max-pods="110" Apr 17 
20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585416 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585419 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585423 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585426 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585429 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585432 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585435 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585444 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585447 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585450 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:03:37.588474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585453 2567 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585457 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585463 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 
20:03:37.585466 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585469 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585473 2567 flags.go:64] FLAG: --port="10250" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585476 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585479 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e1aaa3d42b3f196d" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585482 2567 flags.go:64] FLAG: --qos-reserved="" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585485 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585488 2567 flags.go:64] FLAG: --register-node="true" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585491 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585494 2567 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585498 2567 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585501 2567 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585504 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585507 2567 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585511 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585514 2567 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585517 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585520 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585523 2567 flags.go:64] FLAG: --runonce="false" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585526 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585529 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585532 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:03:37.589108 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585536 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585539 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585542 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585545 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585549 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585552 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585555 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585557 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 
20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585560 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585563 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585566 2567 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585569 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585574 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585577 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585580 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585584 2567 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585587 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585590 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585593 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585596 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585599 2567 flags.go:64] FLAG: --v="2" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585606 2567 flags.go:64] FLAG: --version="false" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585611 2567 flags.go:64] FLAG: --vmodule="" 
Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585615 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.585619 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:03:37.589717 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585721 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585726 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585732 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585734 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585737 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585740 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585743 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585746 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585750 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585758 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585772 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585776 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585779 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585782 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585785 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585787 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585790 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585793 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585795 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:03:37.590337 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585798 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585801 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585803 2567 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585806 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585809 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585811 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585814 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585817 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585819 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585822 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585825 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585827 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585830 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585833 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585836 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 
20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585838 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585841 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585844 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585883 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:03:37.590816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585893 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585896 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585900 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585904 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585908 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585911 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585913 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585916 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 
20:03:37.585919 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585922 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585925 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585927 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585931 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585933 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585936 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585939 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585942 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585944 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585947 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585950 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:03:37.591298 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585952 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:03:37.591848 
ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585955 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585958 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585961 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585964 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585966 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585969 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585971 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585974 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585977 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585980 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585984 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585988 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585991 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 
17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585994 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585996 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.585999 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586002 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586004 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586007 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:03:37.591848 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586010 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586012 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586015 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586017 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586020 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586023 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586025 2567 feature_gate.go:328] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.586028 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:03:37.592388 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.587004 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:03:37.593730 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.593705 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 20:03:37.593785 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.593731 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593797 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593804 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593808 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593812 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593815 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 
20:03:37.593818 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593822 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593825 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:03:37.593824 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593828 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593831 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593833 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593836 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593839 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593842 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593844 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593847 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593850 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593853 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 
20:03:37.593855 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593858 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593860 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593863 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593865 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593868 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593870 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593873 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593875 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593878 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:03:37.594061 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593880 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593883 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593885 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 
20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593889 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593894 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593897 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593900 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593904 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593907 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593910 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593912 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593915 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593918 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593920 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593923 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593926 2567 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593928 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593931 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593934 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593936 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:03:37.594552 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593939 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593942 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593944 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593947 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593949 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593952 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593955 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593958 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:03:37.595074 
ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593960 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593963 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593966 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593968 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593971 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593973 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593976 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593978 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593981 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593983 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593986 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593988 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:03:37.595074 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593992 2567 feature_gate.go:328] unrecognized 
feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.593996 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594000 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594002 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594005 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594008 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594011 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594014 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594017 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594019 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594022 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594024 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594027 2567 feature_gate.go:328] unrecognized feature gate: 
NetworkSegmentation Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594029 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594032 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594035 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594037 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:03:37.595574 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594040 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.594046 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594144 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594149 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594152 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594155 2567 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594158 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594160 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594163 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594166 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594169 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594171 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594174 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594177 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594179 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:03:37.596032 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594182 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594186 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594188 2567 feature_gate.go:328] unrecognized feature gate: 
GatewayAPI Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594191 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594194 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594196 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594199 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594203 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594206 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594209 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594212 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594214 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594217 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594219 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594222 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:03:37.596402 ip-10-0-134-158 
kubenswrapper[2567]: W0417 20:03:37.594224 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594226 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594229 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594231 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594234 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:03:37.596402 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594236 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594239 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594241 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594244 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594246 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594249 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594252 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594255 2567 feature_gate.go:328] unrecognized feature gate: 
MultiDiskSetup Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594257 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594259 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594262 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594265 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594267 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594270 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594273 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594276 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594278 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594280 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594283 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594286 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:03:37.596994 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594288 2567 
feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594291 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594293 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594296 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594298 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594301 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594303 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594306 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594309 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594311 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594314 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594316 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594319 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 
20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594321 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594324 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594327 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594329 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594332 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594335 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594337 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:03:37.597506 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594339 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594342 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594345 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594347 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594350 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594352 2567 feature_gate.go:328] 
unrecognized feature gate: OVNObservability Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594355 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594363 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594367 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594371 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594374 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594377 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:37.594379 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.594385 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:03:37.598020 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.595241 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 20:03:37.598528 ip-10-0-134-158 kubenswrapper[2567]: I0417 
20:03:37.598513 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 20:03:37.599433 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.599421 2567 server.go:1019] "Starting client certificate rotation" Apr 17 20:03:37.599543 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.599525 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:03:37.599582 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.599572 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:03:37.629224 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.629053 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:03:37.631959 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.631935 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:03:37.652921 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.652893 2567 log.go:25] "Validated CRI v1 runtime API" Apr 17 20:03:37.659052 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.659026 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:03:37.659219 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.659180 2567 log.go:25] "Validated CRI v1 image API" Apr 17 20:03:37.660842 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.660823 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 20:03:37.663126 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.663103 2567 fs.go:135] Filesystem UUIDs: map[2fd2f452-f774-4c18-8bb3-9c2ffaa74353:/dev/nvme0n1p4 
444b7100-44ef-4c8d-8a33-072387b6cb14:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 17 20:03:37.663196 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.663124 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 20:03:37.669196 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.669082 2567 manager.go:217] Machine: {Timestamp:2026-04-17 20:03:37.666910682 +0000 UTC m=+0.478325558 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100501 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a7d3b56d45fced0a0a2ae267e35a7 SystemUUID:ec2a7d3b-56d4-5fce-d0a0-a2ae267e35a7 BootID:00cf814d-25d0-4b53-881e-6fcdfb31aa2a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:32:0a:f1:76:15 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:32:0a:f1:76:15 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7a:2f:6e:ef:a8:33 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 20:03:37.669278 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.669215 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 20:03:37.669321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.669308 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 20:03:37.671310 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.671281 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 20:03:37.671461 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.671313 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-158.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 20:03:37.671508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.671472 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 20:03:37.671508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.671480 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 20:03:37.671508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.671494 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:03:37.672819 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.672808 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:03:37.674274 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.674263 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:03:37.674382 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.674373 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 20:03:37.677232 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.677220 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 20:03:37.677282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.677241 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 20:03:37.677282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.677254 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 20:03:37.677282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.677270 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 17 20:03:37.677371 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.677287 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 20:03:37.678804 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.678788 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:03:37.678852 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.678817 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:03:37.682394 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.682376 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 20:03:37.684274 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.684260 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 20:03:37.685872 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685859 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685878 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685885 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685890 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685897 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685903 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685908 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685914 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685921 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 20:03:37.685929 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685927 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 20:03:37.686159 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685944 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 20:03:37.686159 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.685953 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 20:03:37.688014 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.688003 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 20:03:37.688014 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.688015 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 20:03:37.690702 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.690673 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 20:03:37.691066 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.691049 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-158.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 20:03:37.691122 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.691049 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 20:03:37.692017 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.691993 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 20:03:37.692181 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.692170 2567 server.go:1295] "Started kubelet"
Apr 17 20:03:37.692720 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.692667 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 20:03:37.692816 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.692755 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 20:03:37.692871 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.692804 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 20:03:37.693315 ip-10-0-134-158 systemd[1]: Started Kubernetes Kubelet.
Apr 17 20:03:37.694311 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.694255 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 20:03:37.695744 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.695729 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 20:03:37.701111 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.701091 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 20:03:37.701212 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.701114 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 20:03:37.701780 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.701744 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 20:03:37.701780 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.701747 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 20:03:37.701910 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.701791 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 20:03:37.701983 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.701921 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 20:03:37.701983 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.701931 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 20:03:37.702085 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702027 2567 factory.go:55] Registering systemd factory
Apr 17 20:03:37.702085 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702077 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 17 20:03:37.702376 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702322 2567 factory.go:153] Registering CRI-O factory
Apr 17 20:03:37.702376 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702333 2567 factory.go:223] Registration of the crio container factory successfully
Apr 17 20:03:37.702607 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702378 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 20:03:37.702607 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702396 2567 factory.go:103] Registering Raw factory
Apr 17 20:03:37.702607 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702407 2567 manager.go:1196] Started watching for new ooms in manager
Apr 17 20:03:37.702865 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.702741 2567 manager.go:319] Starting recovery of all containers
Apr 17 20:03:37.703886 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.703500 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:37.703990 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.703951 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 20:03:37.705981 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.705952 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tfpth"
Apr 17 20:03:37.709298 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.709265 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 20:03:37.709436 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.709401 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-158.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 20:03:37.710881 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.709690 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-158.ec2.internal.18a73d8496928ff3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-158.ec2.internal,UID:ip-10-0-134-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-158.ec2.internal,},FirstTimestamp:2026-04-17 20:03:37.692114931 +0000 UTC m=+0.503529813,LastTimestamp:2026-04-17 20:03:37.692114931 +0000 UTC m=+0.503529813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-158.ec2.internal,}"
Apr 17 20:03:37.713200 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.713180 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tfpth"
Apr 17 20:03:37.714257 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.714090 2567 manager.go:324] Recovery completed
Apr 17 20:03:37.720197 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.720180 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:03:37.722617 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.722600 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:03:37.722712 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.722636 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:03:37.722712 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.722650 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:03:37.723275 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.723260 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 20:03:37.723275 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.723274 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 20:03:37.723389 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.723330 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:03:37.724743 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.724675 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-158.ec2.internal.18a73d849864015d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-158.ec2.internal,UID:ip-10-0-134-158.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-158.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-158.ec2.internal,},FirstTimestamp:2026-04-17 20:03:37.722618205 +0000 UTC m=+0.534033087,LastTimestamp:2026-04-17 20:03:37.722618205 +0000 UTC m=+0.534033087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-158.ec2.internal,}"
Apr 17 20:03:37.725319 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.725307 2567 policy_none.go:49] "None policy: Start"
Apr 17 20:03:37.725372 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.725324 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 20:03:37.725372 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.725333 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 20:03:37.762412 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.762391 2567 manager.go:341] "Starting Device Plugin manager"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.762428 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.762443 2567 server.go:85] "Starting device plugin registration server"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.762714 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.762746 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.762898 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.763000 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.763016 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.763517 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 20:03:37.775124 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.763554 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:37.795653 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.795616 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 20:03:37.796749 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.796735 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 20:03:37.796832 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.796772 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 20:03:37.796832 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.796795 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 20:03:37.796832 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.796802 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 20:03:37.796957 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.796833 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 20:03:37.800281 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.800264 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:03:37.863528 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.863461 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:03:37.864407 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.864390 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:03:37.864483 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.864422 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:03:37.864483 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.864433 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:03:37.864483 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.864460 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.873888 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.873866 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.873944 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.873893 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-158.ec2.internal\": node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:37.894800 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.894749 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:37.897780 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.897746 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"]
Apr 17 20:03:37.897844 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.897837 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:03:37.901663 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.901640 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:03:37.901799 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.901673 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:03:37.901799 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.901685 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:03:37.902538 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.902522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.902574 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.902545 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.903976 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.903964 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:03:37.904110 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904085 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.904146 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904127 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:03:37.904885 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904870 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:03:37.904963 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904893 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:03:37.904963 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904905 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:03:37.904963 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904915 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:03:37.904963 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904943 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:03:37.904963 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.904954 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:03:37.907333 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.907317 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.907410 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.907344 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 20:03:37.908201 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.908188 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 20:03:37.908264 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.908216 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 20:03:37.908264 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:37.908226 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeHasSufficientPID"
Apr 17 20:03:37.934698 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.934673 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-158.ec2.internal\" not found" node="ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.939280 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.939261 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-158.ec2.internal\" not found" node="ip-10-0-134-158.ec2.internal"
Apr 17 20:03:37.995486 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:37.995450 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.002827 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.002805 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.002931 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.002838 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.002931 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.002882 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.002931 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.002910 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d77163023509e27e7eae0d866efd9e46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal\" (UID: \"d77163023509e27e7eae0d866efd9e46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.095977 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:38.095941 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.103231 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.103205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51b4526a209496dd9377d4f989eaa37c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-158.ec2.internal\" (UID: \"51b4526a209496dd9377d4f989eaa37c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.196738 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:38.196655 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.204028 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.203996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51b4526a209496dd9377d4f989eaa37c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-158.ec2.internal\" (UID: \"51b4526a209496dd9377d4f989eaa37c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.204086 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.204060 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51b4526a209496dd9377d4f989eaa37c-config\") pod \"kube-apiserver-proxy-ip-10-0-134-158.ec2.internal\" (UID: \"51b4526a209496dd9377d4f989eaa37c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.237186 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.237154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.241801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.241778 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.297545 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:38.297510 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.398093 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:38.398060 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.498586 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:38.498554 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.545822 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.545794 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:03:38.598969 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:38.598928 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.598969 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.598966 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 20:03:38.599668 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.599133 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:03:38.599668 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.599157 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:03:38.699808 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:38.699776 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-158.ec2.internal\" not found"
Apr 17 20:03:38.701207 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.701180 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 20:03:38.711305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.711275 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:03:38.716181 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.716134 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 19:58:37 +0000 UTC" deadline="2027-10-24 20:38:16.546107123 +0000 UTC"
Apr 17 20:03:38.716181 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.716176 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13320h34m37.829934282s"
Apr 17 20:03:38.733080 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.733046 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ln84j"
Apr 17 20:03:38.734439 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.734418 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:03:38.739903 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.739883 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ln84j"
Apr 17 20:03:38.753520 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.753452 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:03:38.799281 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:38.799076 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51b4526a209496dd9377d4f989eaa37c.slice/crio-7f16254a0d45d69710849aa3d770e76d5cea701d456f1a8e4fb0d9b2bf42a5c8 WatchSource:0}: Error finding container 7f16254a0d45d69710849aa3d770e76d5cea701d456f1a8e4fb0d9b2bf42a5c8: Status 404 returned error can't find the container with id 7f16254a0d45d69710849aa3d770e76d5cea701d456f1a8e4fb0d9b2bf42a5c8
Apr 17 20:03:38.799938 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:38.799908 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd77163023509e27e7eae0d866efd9e46.slice/crio-9ec3c6d5187eed6d183eefcd171547e0a23715d94591f29633c0851045a509de WatchSource:0}: Error finding container 9ec3c6d5187eed6d183eefcd171547e0a23715d94591f29633c0851045a509de: Status 404 returned error can't find the container with id 9ec3c6d5187eed6d183eefcd171547e0a23715d94591f29633c0851045a509de
Apr 17 20:03:38.802013 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.801990 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal"
Apr 17 20:03:38.803816 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.803800 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:03:38.813685 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.813663 2567 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:03:38.815009 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.814991 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" Apr 17 20:03:38.822940 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:38.822922 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 20:03:39.678306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.678118 2567 apiserver.go:52] "Watching apiserver" Apr 17 20:03:39.684920 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.684889 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 20:03:39.685819 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.685789 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-648pj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal","openshift-multus/multus-78p7f","openshift-multus/multus-additional-cni-plugins-q4p8s","openshift-multus/network-metrics-daemon-6vrjk","openshift-network-operator/iptables-alerter-ft42n","kube-system/global-pull-secret-syncer-dq6ll","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg","openshift-dns/node-resolver-m59bq","openshift-image-registry/node-ca-cgt9r","openshift-network-diagnostics/network-check-target-lzj47","openshift-ovn-kubernetes/ovnkube-node-ml4d4","kube-system/konnectivity-agent-6sb8k","kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal"] Apr 17 20:03:39.690778 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.690744 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.692953 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.692932 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:03:39.695410 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.695389 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:39.695515 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.695477 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:39.695677 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.695657 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 20:03:39.695877 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.695858 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:03:39.695956 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.695892 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q6mkk\"" Apr 17 20:03:39.696070 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.696053 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8zd92\"" Apr 17 20:03:39.696460 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.696444 2567 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 20:03:39.696541 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.696524 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 20:03:39.699577 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.699557 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:39.699667 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.699623 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:39.702544 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.701978 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:39.702544 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.702037 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:39.706785 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.705718 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710100 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.708458 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 20:03:39.710100 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.708665 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 20:03:39.710100 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.708731 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 20:03:39.710100 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.708670 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5xzvn\"" Apr 17 20:03:39.710100 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.708959 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 20:03:39.710100 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.709665 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-ft42n" Apr 17 20:03:39.710411 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710204 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-systemd\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710411 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710243 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-socket-dir-parent\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710411 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710279 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bms7\" (UniqueName: \"kubernetes.io/projected/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-kube-api-access-2bms7\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710411 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710310 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:39.710411 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710347 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-lib-modules\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710411 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710396 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9812077-3cec-42c4-91b6-506bbe029371-etc-tuned\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710684 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710430 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-cni-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710684 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710504 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-modprobe-d\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710684 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710550 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-kubernetes\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710684 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710597 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-run\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710684 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710654 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/243940a2-412a-4019-b966-f66af5d78985-agent-certs\") pod \"konnectivity-agent-6sb8k\" (UID: \"243940a2-412a-4019-b966-f66af5d78985\") " pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:03:39.710947 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710686 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/243940a2-412a-4019-b966-f66af5d78985-konnectivity-ca\") pod \"konnectivity-agent-6sb8k\" (UID: \"243940a2-412a-4019-b966-f66af5d78985\") " pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:03:39.710947 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710711 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-k8s-cni-cncf-io\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710947 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710784 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-conf-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710947 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:03:39.710825 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-daemon-config\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710947 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710875 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-var-lib-kubelet\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710947 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710894 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9812077-3cec-42c4-91b6-506bbe029371-tmp\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.710947 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-system-cni-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.710947 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710944 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-os-release\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 
17 20:03:39.711305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-cni-bin\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.711305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.710997 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-hostroot\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.711305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.711055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-multus-certs\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.711305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.711092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-etc-kubernetes\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.711305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.711129 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-sys\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " 
pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.712820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712005 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.712820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712387 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-cni-binary-copy\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.712820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b977d53-172d-4a66-8807-758a1e1abc45-dbus\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:39.712820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fca79bb6-cb8a-4910-8fd0-c7d340049d44-iptables-alerter-script\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n" Apr 17 20:03:39.712820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712472 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.712820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fca79bb6-cb8a-4910-8fd0-c7d340049d44-host-slash\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n" Apr 17 20:03:39.712820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712756 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcld9\" (UniqueName: \"kubernetes.io/projected/fca79bb6-cb8a-4910-8fd0-c7d340049d44-kube-api-access-dcld9\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712830 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysctl-conf\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712867 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-host\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712905 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-netns\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712952 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.712999 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysconfig\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysctl-d\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713064 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-cnibin\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713089 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-kubelet\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.713190 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713118 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b977d53-172d-4a66-8807-758a1e1abc45-kubelet-config\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:39.713605 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713218 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:03:39.713703 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713664 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:39.713790 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713704 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 20:03:39.713790 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713711 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvzg5\" (UniqueName: \"kubernetes.io/projected/b9812077-3cec-42c4-91b6-506bbe029371-kube-api-access-rvzg5\") pod \"tuned-648pj\" (UID: 
\"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.713790 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713742 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-cni-multus\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.713935 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.713806 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nl7q\" (UniqueName: \"kubernetes.io/projected/3ab88728-120f-4d07-91b8-97fe1307e061-kube-api-access-6nl7q\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:39.717433 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.717412 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 20:03:39.717433 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.717427 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qrlkt\"" Apr 17 20:03:39.717931 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.717912 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 20:03:39.718005 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.717980 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:39.718057 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.718013 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mwk7d\"" Apr 17 20:03:39.718539 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.718099 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 20:03:39.718657 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.718642 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 20:03:39.720374 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.720353 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hbspv\"" Apr 17 20:03:39.720559 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.720539 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 20:03:39.720826 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.720800 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 20:03:39.720918 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.720860 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 20:03:39.720975 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.720960 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 20:03:39.721098 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.721081 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 20:03:39.721308 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.721293 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7mfpt\"" Apr 17 20:03:39.722258 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.722240 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.724668 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.724356 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 20:03:39.724777 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.724740 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dkw2q\"" Apr 17 20:03:39.724854 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.724806 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.725348 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.725287 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 20:03:39.727136 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.726792 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 20:03:39.727791 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.727686 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 20:03:39.727909 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.727826 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 20:03:39.727909 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.727859 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vgm4k\"" Apr 17 20:03:39.728050 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.727959 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 20:03:39.728050 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.728014 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 20:03:39.728320 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.728303 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 20:03:39.740794 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.740738 2567 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 19:58:38 +0000 UTC" deadline="2027-11-24 20:48:25.542542137 +0000 UTC" Apr 17 20:03:39.740794 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.740792 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14064h44m45.801755096s" Apr 17 20:03:39.802724 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.802655 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:03:39.803026 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.802979 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" event={"ID":"d77163023509e27e7eae0d866efd9e46","Type":"ContainerStarted","Data":"9ec3c6d5187eed6d183eefcd171547e0a23715d94591f29633c0851045a509de"} Apr 17 20:03:39.804701 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.804653 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" event={"ID":"51b4526a209496dd9377d4f989eaa37c","Type":"ContainerStarted","Data":"7f16254a0d45d69710849aa3d770e76d5cea701d456f1a8e4fb0d9b2bf42a5c8"} Apr 17 20:03:39.814321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.814281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:39.814456 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.814324 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-sys-fs\") pod 
\"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.814456 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.814385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9812077-3cec-42c4-91b6-506bbe029371-tmp\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.814456 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.814420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-os-release\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.814456 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.814454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-etc-kubernetes\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.815057 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815025 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-system-cni-dir\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.815185 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-os-release\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.815185 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815140 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-os-release\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.815185 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815155 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.815307 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815201 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.815307 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815248 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cnibin\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.815307 ip-10-0-134-158 kubenswrapper[2567]: I0417 
20:03:39.815279 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-ovnkube-config\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.815448 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815331 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz4l8\" (UniqueName: \"kubernetes.io/projected/9f323355-87cd-4d74-ba77-22d401a93474-kube-api-access-gz4l8\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.815448 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815368 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcld9\" (UniqueName: \"kubernetes.io/projected/fca79bb6-cb8a-4910-8fd0-c7d340049d44-kube-api-access-dcld9\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n" Apr 17 20:03:39.815448 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysctl-conf\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.815448 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-host\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " 
pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.815448 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-etc-kubernetes\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.815680 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-kubelet\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.815680 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-systemd\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.815680 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815522 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-cni-bin\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.815892 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysctl-d\") pod \"tuned-648pj\" (UID: 
\"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.815892 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815601 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysctl-conf\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.816011 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-host\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.816011 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.815518 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 20:03:39.816470 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b977d53-172d-4a66-8807-758a1e1abc45-kubelet-config\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:39.816550 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816475 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:39.816610 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cni-binary-copy\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.816667 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-ovnkube-script-lib\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.816667 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rvzg5\" (UniqueName: \"kubernetes.io/projected/b9812077-3cec-42c4-91b6-506bbe029371-kube-api-access-rvzg5\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.816667 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-cni-multus\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.816936 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816915 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-cni-multus\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.817044 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.816946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b977d53-172d-4a66-8807-758a1e1abc45-kubelet-config\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:39.817113 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817066 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.817113 ip-10-0-134-158 kubenswrapper[2567]: E0417 
20:03:39.817103 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:39.817223 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.817210 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret podName:7b977d53-172d-4a66-8807-758a1e1abc45 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:40.317158003 +0000 UTC m=+3.128572887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret") pod "global-pull-secret-syncer-dq6ll" (UID: "7b977d53-172d-4a66-8807-758a1e1abc45") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:39.817223 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817136 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-var-lib-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.817327 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817246 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysctl-d\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.817411 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817368 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-node-log\") pod \"ovnkube-node-ml4d4\" (UID: 
\"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.817477 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-systemd\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.817554 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bms7\" (UniqueName: \"kubernetes.io/projected/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-kube-api-access-2bms7\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.817604 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817580 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:39.817698 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817664 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-socket-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.817698 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-systemd\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.817803 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817742 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-run-netns\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.817803 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817795 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-etc-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.817901 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-log-socket\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.817901 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817872 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-kubernetes\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.817987 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.817939 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/243940a2-412a-4019-b966-f66af5d78985-agent-certs\") pod \"konnectivity-agent-6sb8k\" (UID: \"243940a2-412a-4019-b966-f66af5d78985\") " pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:03:39.818059 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.817987 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:39.818059 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/243940a2-412a-4019-b966-f66af5d78985-konnectivity-ca\") pod \"konnectivity-agent-6sb8k\" (UID: \"243940a2-412a-4019-b966-f66af5d78985\") " pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:03:39.818192 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818170 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-kubernetes\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.818250 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.818233 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:40.318207674 +0000 UTC m=+3.129622550 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:39.818328 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-k8s-cni-cncf-io\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.818328 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818320 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-daemon-config\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.818416 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818397 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-multus-certs\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.818463 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818436 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj2fm\" (UniqueName: \"kubernetes.io/projected/2281133e-34d1-4450-acb3-cbecb7262008-kube-api-access-fj2fm\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.818514 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:03:39.818475 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.818576 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818506 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-var-lib-kubelet\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.818640 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818604 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-system-cni-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.818693 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-cni-bin\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.818693 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-k8s-cni-cncf-io\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 
20:03:39.818801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818705 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-hostroot\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.818801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818742 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-system-cni-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f" Apr 17 20:03:39.818801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818744 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-device-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.818801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818801 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-var-lib-kubelet\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:39.818801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818813 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/153e1d90-2d3e-4083-88f9-781771f16266-tmp-dir\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.819038 
ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818828 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-cni-bin\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.819038 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818849 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-multus-certs\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.819038 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.819038 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-sys\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.819038 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818913 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-cni-binary-copy\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.819038 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.818993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b977d53-172d-4a66-8807-758a1e1abc45-dbus\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll"
Apr 17 20:03:39.819038 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819026 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg"
Apr 17 20:03:39.819321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819050 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/153e1d90-2d3e-4083-88f9-781771f16266-hosts-file\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq"
Apr 17 20:03:39.819321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819078 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-host\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r"
Apr 17 20:03:39.819321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819117 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvdjv\" (UniqueName: \"kubernetes.io/projected/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-kube-api-access-bvdjv\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r"
Apr 17 20:03:39.819321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819152 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshsw\" (UniqueName: \"kubernetes.io/projected/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-kube-api-access-wshsw\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.819321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819202 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fca79bb6-cb8a-4910-8fd0-c7d340049d44-iptables-alerter-script\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n"
Apr 17 20:03:39.819321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fca79bb6-cb8a-4910-8fd0-c7d340049d44-host-slash\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n"
Apr 17 20:03:39.819321 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819286 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-daemon-config\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819347 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-netns\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819382 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-etc-selinux\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819392 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fca79bb6-cb8a-4910-8fd0-c7d340049d44-host-slash\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819411 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-serviceca\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819449 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-sys\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-run-netns\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819583 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-hostroot\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.819624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819615 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b977d53-172d-4a66-8807-758a1e1abc45-dbus\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll"
Apr 17 20:03:39.820007 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-env-overrides\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.820007 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819705 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysconfig\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.820007 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.819826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-cnibin\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.820148 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820090 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-cni-binary-copy\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.820148 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820101 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fca79bb6-cb8a-4910-8fd0-c7d340049d44-iptables-alerter-script\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n"
Apr 17 20:03:39.820300 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-cnibin\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.820367 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-kubelet\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.820367 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-sysconfig\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.820493 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nl7q\" (UniqueName: \"kubernetes.io/projected/3ab88728-120f-4d07-91b8-97fe1307e061-kube-api-access-6nl7q\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk"
Apr 17 20:03:39.820493 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-systemd-units\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.820611 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820490 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-host-var-lib-kubelet\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.820611 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820550 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-slash\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.820611 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820600 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-socket-dir-parent\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.820785 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820631 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-registration-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg"
Apr 17 20:03:39.820785 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820693 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-run-ovn-kubernetes\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.820899 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820787 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/243940a2-412a-4019-b966-f66af5d78985-konnectivity-ca\") pod \"konnectivity-agent-6sb8k\" (UID: \"243940a2-412a-4019-b966-f66af5d78985\") " pod="kube-system/konnectivity-agent-6sb8k"
Apr 17 20:03:39.820899 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-cni-netd\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.821051 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820893 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-socket-dir-parent\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.821051 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820916 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-lib-modules\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.821051 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.820952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9812077-3cec-42c4-91b6-506bbe029371-etc-tuned\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.821051 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821023 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-cni-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.821282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821078 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-lib-modules\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.821282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821120 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-cni-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.821282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-ovn\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.821282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821176 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f323355-87cd-4d74-ba77-22d401a93474-ovn-node-metrics-cert\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.821282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821229 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvnt\" (UniqueName: \"kubernetes.io/projected/153e1d90-2d3e-4083-88f9-781771f16266-kube-api-access-4pvnt\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq"
Apr 17 20:03:39.821560 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821298 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-modprobe-d\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.821560 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-etc-modprobe-d\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.821560 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-run\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.821560 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-conf-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.821822 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821557 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b9812077-3cec-42c4-91b6-506bbe029371-run\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.821822 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.821633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-multus-conf-dir\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.822553 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.822508 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b9812077-3cec-42c4-91b6-506bbe029371-tmp\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.822788 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.822750 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:03:39.822902 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.822795 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:03:39.822902 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.822813 2567 projected.go:194] Error preparing data for projected volume kube-api-access-l4ksd for pod openshift-network-diagnostics/network-check-target-lzj47: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:03:39.822902 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:39.822876 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd podName:9b5d0119-32c6-4587-994b-0d70198060ea nodeName:}" failed. No retries permitted until 2026-04-17 20:03:40.322858342 +0000 UTC m=+3.134273222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l4ksd" (UniqueName: "kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd") pod "network-check-target-lzj47" (UID: "9b5d0119-32c6-4587-994b-0d70198060ea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:03:39.823964 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.823917 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b9812077-3cec-42c4-91b6-506bbe029371-etc-tuned\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.825472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.825226 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/243940a2-412a-4019-b966-f66af5d78985-agent-certs\") pod \"konnectivity-agent-6sb8k\" (UID: \"243940a2-412a-4019-b966-f66af5d78985\") " pod="kube-system/konnectivity-agent-6sb8k"
Apr 17 20:03:39.825472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.825409 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcld9\" (UniqueName: \"kubernetes.io/projected/fca79bb6-cb8a-4910-8fd0-c7d340049d44-kube-api-access-dcld9\") pod \"iptables-alerter-ft42n\" (UID: \"fca79bb6-cb8a-4910-8fd0-c7d340049d44\") " pod="openshift-network-operator/iptables-alerter-ft42n"
Apr 17 20:03:39.825985 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.825958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvzg5\" (UniqueName: \"kubernetes.io/projected/b9812077-3cec-42c4-91b6-506bbe029371-kube-api-access-rvzg5\") pod \"tuned-648pj\" (UID: \"b9812077-3cec-42c4-91b6-506bbe029371\") " pod="openshift-cluster-node-tuning-operator/tuned-648pj"
Apr 17 20:03:39.829607 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.829568 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bms7\" (UniqueName: \"kubernetes.io/projected/3abe62da-aef2-4ef2-85a2-278e4f8fe4c1-kube-api-access-2bms7\") pod \"multus-78p7f\" (UID: \"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1\") " pod="openshift-multus/multus-78p7f"
Apr 17 20:03:39.830516 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.830474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nl7q\" (UniqueName: \"kubernetes.io/projected/3ab88728-120f-4d07-91b8-97fe1307e061-kube-api-access-6nl7q\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk"
Apr 17 20:03:39.922979 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.922807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-os-release\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.922979 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.922871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.922979 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.922901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.922979 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.922931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-os-release\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.922979 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.922932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cnibin\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.922979 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.922983 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cnibin\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-ovnkube-config\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz4l8\" (UniqueName: \"kubernetes.io/projected/9f323355-87cd-4d74-ba77-22d401a93474-kube-api-access-gz4l8\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923058 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-kubelet\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923100 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923104 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-systemd\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-cni-bin\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923176 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cni-binary-copy\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-ovnkube-script-lib\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923258 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-var-lib-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-node-log\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923320 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-socket-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923359 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-run-netns\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-etc-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923429 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-kubelet\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.923472 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-systemd\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923497 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-etc-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923563 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-cni-bin\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923606 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-node-log\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-ovnkube-config\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923792 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-var-lib-openvswitch\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923827 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-socket-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-run-netns\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-log-socket\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923900 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj2fm\" (UniqueName: \"kubernetes.io/projected/2281133e-34d1-4450-acb3-cbecb7262008-kube-api-access-fj2fm\") pod
\"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923948 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-log-socket\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-ovnkube-script-lib\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924042 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-device-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.923991 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cni-binary-copy\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924069 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/153e1d90-2d3e-4083-88f9-781771f16266-tmp-dir\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.924175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924111 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924129 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-device-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 
20:03:39.924138 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924212 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/153e1d90-2d3e-4083-88f9-781771f16266-hosts-file\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-host\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvdjv\" (UniqueName: \"kubernetes.io/projected/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-kube-api-access-bvdjv\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:39.925045 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:03:39.924274 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/153e1d90-2d3e-4083-88f9-781771f16266-hosts-file\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wshsw\" (UniqueName: \"kubernetes.io/projected/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-kube-api-access-wshsw\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/153e1d90-2d3e-4083-88f9-781771f16266-tmp-dir\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-etc-selinux\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 
17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924366 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-serviceca\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-env-overrides\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-systemd-units\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924439 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-etc-selinux\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924446 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-slash\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 
20:03:39.925045 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924480 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-slash\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-registration-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-run-ovn-kubernetes\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924544 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-registration-dir\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924582 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-systemd-units\") pod \"ovnkube-node-ml4d4\" (UID: 
\"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-cni-netd\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924624 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-ovn\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f323355-87cd-4d74-ba77-22d401a93474-ovn-node-metrics-cert\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-cni-netd\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924684 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-host\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924726 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvnt\" (UniqueName: \"kubernetes.io/projected/153e1d90-2d3e-4083-88f9-781771f16266-kube-api-access-4pvnt\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924792 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-sys-fs\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-system-cni-dir\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924891 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-serviceca\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924902 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-system-cni-dir\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924923 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f323355-87cd-4d74-ba77-22d401a93474-env-overrides\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.925806 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-run-ovn\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.926501 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924959 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2281133e-34d1-4450-acb3-cbecb7262008-sys-fs\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.926501 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.924965 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/9f323355-87cd-4d74-ba77-22d401a93474-host-run-ovn-kubernetes\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.927367 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.927344 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f323355-87cd-4d74-ba77-22d401a93474-ovn-node-metrics-cert\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.932095 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.931983 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvnt\" (UniqueName: \"kubernetes.io/projected/153e1d90-2d3e-4083-88f9-781771f16266-kube-api-access-4pvnt\") pod \"node-resolver-m59bq\" (UID: \"153e1d90-2d3e-4083-88f9-781771f16266\") " pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:39.932205 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.932097 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz4l8\" (UniqueName: \"kubernetes.io/projected/9f323355-87cd-4d74-ba77-22d401a93474-kube-api-access-gz4l8\") pod \"ovnkube-node-ml4d4\" (UID: \"9f323355-87cd-4d74-ba77-22d401a93474\") " pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:39.932205 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.932169 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj2fm\" (UniqueName: \"kubernetes.io/projected/2281133e-34d1-4450-acb3-cbecb7262008-kube-api-access-fj2fm\") pod \"aws-ebs-csi-driver-node-kb8dg\" (UID: \"2281133e-34d1-4450-acb3-cbecb7262008\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:39.932892 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.932865 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wshsw\" (UniqueName: \"kubernetes.io/projected/a4667d02-88e0-4ffd-a42f-77c06bdf9c21-kube-api-access-wshsw\") pod \"multus-additional-cni-plugins-q4p8s\" (UID: \"a4667d02-88e0-4ffd-a42f-77c06bdf9c21\") " pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:39.933015 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:39.932914 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvdjv\" (UniqueName: \"kubernetes.io/projected/0286d3b0-9e6f-498c-b50c-69d5149b3f0d-kube-api-access-bvdjv\") pod \"node-ca-cgt9r\" (UID: \"0286d3b0-9e6f-498c-b50c-69d5149b3f0d\") " pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:40.007090 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.007046 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-648pj" Apr 17 20:03:40.022855 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.022830 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:03:40.032599 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.032566 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-78p7f" Apr 17 20:03:40.040316 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.040285 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ft42n" Apr 17 20:03:40.049061 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.049036 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" Apr 17 20:03:40.056737 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.056714 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-m59bq" Apr 17 20:03:40.065437 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.065409 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cgt9r" Apr 17 20:03:40.074115 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.074092 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" Apr 17 20:03:40.081916 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.081889 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:03:40.095470 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.095444 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:03:40.328075 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.328047 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:40.328075 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.328083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.328127 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: 
\"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328190 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328238 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328264 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret podName:7b977d53-172d-4a66-8807-758a1e1abc45 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:41.328244906 +0000 UTC m=+4.139659770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret") pod "global-pull-secret-syncer-dq6ll" (UID: "7b977d53-172d-4a66-8807-758a1e1abc45") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328287 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:41.32827544 +0000 UTC m=+4.139690306 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328239 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328309 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328322 2567 projected.go:194] Error preparing data for projected volume kube-api-access-l4ksd for pod openshift-network-diagnostics/network-check-target-lzj47: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:40.328350 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:40.328356 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd podName:9b5d0119-32c6-4587-994b-0d70198060ea nodeName:}" failed. No retries permitted until 2026-04-17 20:03:41.328346046 +0000 UTC m=+4.139760926 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l4ksd" (UniqueName: "kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd") pod "network-check-target-lzj47" (UID: "9b5d0119-32c6-4587-994b-0d70198060ea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:40.455382 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.455348 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:03:40.527844 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.527813 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3abe62da_aef2_4ef2_85a2_278e4f8fe4c1.slice/crio-2a631deafd6913804f145dbbe765af0286e0a9aea58ce425a8e152f8bfc0ebb3 WatchSource:0}: Error finding container 2a631deafd6913804f145dbbe765af0286e0a9aea58ce425a8e152f8bfc0ebb3: Status 404 returned error can't find the container with id 2a631deafd6913804f145dbbe765af0286e0a9aea58ce425a8e152f8bfc0ebb3 Apr 17 20:03:40.528719 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.528698 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2281133e_34d1_4450_acb3_cbecb7262008.slice/crio-eb450b4f6ef136983b3e8cc830b7302866e7ca6062c59295f396410b33b3fe8e WatchSource:0}: Error finding container eb450b4f6ef136983b3e8cc830b7302866e7ca6062c59295f396410b33b3fe8e: Status 404 returned error can't find the container with id eb450b4f6ef136983b3e8cc830b7302866e7ca6062c59295f396410b33b3fe8e Apr 17 20:03:40.529971 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.529944 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243940a2_412a_4019_b966_f66af5d78985.slice/crio-ca6fff27eb4b48db578990172a81cde99045ad2e5e69c2ca517d6bf52282fa32 WatchSource:0}: Error finding container ca6fff27eb4b48db578990172a81cde99045ad2e5e69c2ca517d6bf52282fa32: Status 404 returned error can't find the container with id ca6fff27eb4b48db578990172a81cde99045ad2e5e69c2ca517d6bf52282fa32 Apr 17 20:03:40.533055 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.533028 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f323355_87cd_4d74_ba77_22d401a93474.slice/crio-8c826ef690c1212215ecb368fe4cd28bea26db7f19c3d4d2443c47adf51bf2c0 WatchSource:0}: Error finding container 8c826ef690c1212215ecb368fe4cd28bea26db7f19c3d4d2443c47adf51bf2c0: Status 404 returned error can't find the container with id 8c826ef690c1212215ecb368fe4cd28bea26db7f19c3d4d2443c47adf51bf2c0 Apr 17 20:03:40.536203 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.536149 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9812077_3cec_42c4_91b6_506bbe029371.slice/crio-05e870dc44239d42afcf437963a4a549c8ac2e3cd238da7a7a249815f7a76b4b WatchSource:0}: Error finding container 05e870dc44239d42afcf437963a4a549c8ac2e3cd238da7a7a249815f7a76b4b: Status 404 returned error can't find the container with id 05e870dc44239d42afcf437963a4a549c8ac2e3cd238da7a7a249815f7a76b4b Apr 17 20:03:40.538144 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.537794 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod153e1d90_2d3e_4083_88f9_781771f16266.slice/crio-3bab17eb65aceaf48dc64e8d9694f4baf585f8f578539252df92876e06d7d8b1 WatchSource:0}: Error finding container 3bab17eb65aceaf48dc64e8d9694f4baf585f8f578539252df92876e06d7d8b1: Status 404 returned error can't find 
the container with id 3bab17eb65aceaf48dc64e8d9694f4baf585f8f578539252df92876e06d7d8b1 Apr 17 20:03:40.538610 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.538520 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4667d02_88e0_4ffd_a42f_77c06bdf9c21.slice/crio-624b5ba80e509dba54ea973228eeccc2e27ff2a5463a63d232f423c71178b66c WatchSource:0}: Error finding container 624b5ba80e509dba54ea973228eeccc2e27ff2a5463a63d232f423c71178b66c: Status 404 returned error can't find the container with id 624b5ba80e509dba54ea973228eeccc2e27ff2a5463a63d232f423c71178b66c Apr 17 20:03:40.539376 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.539355 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0286d3b0_9e6f_498c_b50c_69d5149b3f0d.slice/crio-1be03a89a1780badefd1e25c2bbe47dc10d2efeab4e884cf79f60b0cdae1d5b5 WatchSource:0}: Error finding container 1be03a89a1780badefd1e25c2bbe47dc10d2efeab4e884cf79f60b0cdae1d5b5: Status 404 returned error can't find the container with id 1be03a89a1780badefd1e25c2bbe47dc10d2efeab4e884cf79f60b0cdae1d5b5 Apr 17 20:03:40.540348 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:03:40.540283 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca79bb6_cb8a_4910_8fd0_c7d340049d44.slice/crio-59fb842efa437003e64494b35413e20fc54228831d6fd5619acfe434f7e7d68b WatchSource:0}: Error finding container 59fb842efa437003e64494b35413e20fc54228831d6fd5619acfe434f7e7d68b: Status 404 returned error can't find the container with id 59fb842efa437003e64494b35413e20fc54228831d6fd5619acfe434f7e7d68b Apr 17 20:03:40.741258 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.741216 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 19:58:38 +0000 
UTC" deadline="2027-10-29 22:02:34.259729644 +0000 UTC" Apr 17 20:03:40.741258 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.741249 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13441h58m53.518483108s" Apr 17 20:03:40.810109 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.810026 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" event={"ID":"51b4526a209496dd9377d4f989eaa37c","Type":"ContainerStarted","Data":"21290249e790c09643ec138ef32a3c3bd2a945b522063035d4a5329e0c4b7ccb"} Apr 17 20:03:40.811102 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.811065 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m59bq" event={"ID":"153e1d90-2d3e-4083-88f9-781771f16266","Type":"ContainerStarted","Data":"3bab17eb65aceaf48dc64e8d9694f4baf585f8f578539252df92876e06d7d8b1"} Apr 17 20:03:40.812069 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.812049 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"8c826ef690c1212215ecb368fe4cd28bea26db7f19c3d4d2443c47adf51bf2c0"} Apr 17 20:03:40.813077 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.813055 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6sb8k" event={"ID":"243940a2-412a-4019-b966-f66af5d78985","Type":"ContainerStarted","Data":"ca6fff27eb4b48db578990172a81cde99045ad2e5e69c2ca517d6bf52282fa32"} Apr 17 20:03:40.813930 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.813905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" event={"ID":"2281133e-34d1-4450-acb3-cbecb7262008","Type":"ContainerStarted","Data":"eb450b4f6ef136983b3e8cc830b7302866e7ca6062c59295f396410b33b3fe8e"} Apr 17 
20:03:40.815193 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.815171 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78p7f" event={"ID":"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1","Type":"ContainerStarted","Data":"2a631deafd6913804f145dbbe765af0286e0a9aea58ce425a8e152f8bfc0ebb3"} Apr 17 20:03:40.816006 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.815988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ft42n" event={"ID":"fca79bb6-cb8a-4910-8fd0-c7d340049d44","Type":"ContainerStarted","Data":"59fb842efa437003e64494b35413e20fc54228831d6fd5619acfe434f7e7d68b"} Apr 17 20:03:40.816906 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.816880 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cgt9r" event={"ID":"0286d3b0-9e6f-498c-b50c-69d5149b3f0d","Type":"ContainerStarted","Data":"1be03a89a1780badefd1e25c2bbe47dc10d2efeab4e884cf79f60b0cdae1d5b5"} Apr 17 20:03:40.817846 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.817798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerStarted","Data":"624b5ba80e509dba54ea973228eeccc2e27ff2a5463a63d232f423c71178b66c"} Apr 17 20:03:40.818614 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:40.818593 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-648pj" event={"ID":"b9812077-3cec-42c4-91b6-506bbe029371","Type":"ContainerStarted","Data":"05e870dc44239d42afcf437963a4a549c8ac2e3cd238da7a7a249815f7a76b4b"} Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.336584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod 
\"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.336656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.336749 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.336894 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.336959 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:43.336940236 +0000 UTC m=+6.148355104 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.337074 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.337104 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.337119 2567 projected.go:194] Error preparing data for projected volume kube-api-access-l4ksd for pod openshift-network-diagnostics/network-check-target-lzj47: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.337170 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd podName:9b5d0119-32c6-4587-994b-0d70198060ea nodeName:}" failed. No retries permitted until 2026-04-17 20:03:43.337154797 +0000 UTC m=+6.148569672 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l4ksd" (UniqueName: "kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd") pod "network-check-target-lzj47" (UID: "9b5d0119-32c6-4587-994b-0d70198060ea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.337236 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:41.337292 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.337270 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret podName:7b977d53-172d-4a66-8807-758a1e1abc45 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:43.337257826 +0000 UTC m=+6.148672702 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret") pod "global-pull-secret-syncer-dq6ll" (UID: "7b977d53-172d-4a66-8807-758a1e1abc45") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:41.800541 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.799658 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:41.800541 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.799810 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:41.800541 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.800225 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:41.800541 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.800332 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:41.800541 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.800413 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:41.800541 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:41.800489 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:41.834475 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.834405 2567 generic.go:358] "Generic (PLEG): container finished" podID="d77163023509e27e7eae0d866efd9e46" containerID="0d2eac1f55e6163752bcea83b3e81283633115b5706afc27c60cbb9105d57a18" exitCode=0 Apr 17 20:03:41.835517 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.835483 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" event={"ID":"d77163023509e27e7eae0d866efd9e46","Type":"ContainerDied","Data":"0d2eac1f55e6163752bcea83b3e81283633115b5706afc27c60cbb9105d57a18"} Apr 17 20:03:41.849922 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:41.848864 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-158.ec2.internal" podStartSLOduration=3.848843243 podStartE2EDuration="3.848843243s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:03:40.822496168 +0000 UTC m=+3.633911053" watchObservedRunningTime="2026-04-17 20:03:41.848843243 +0000 UTC m=+4.660258130" Apr 17 20:03:42.867007 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:42.866275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" event={"ID":"d77163023509e27e7eae0d866efd9e46","Type":"ContainerStarted","Data":"8d2c7116a0f620af7dd4cc84fc72a432f8aee4bc0dd99c4aecedbff501bc012f"} Apr 17 20:03:43.354392 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:43.354350 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: 
\"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:43.354582 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:43.354419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:43.354582 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:43.354453 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:43.354701 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.354602 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:43.354701 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.354668 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:47.354648888 +0000 UTC m=+10.166063757 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:43.354880 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.354856 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:43.354937 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.354900 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret podName:7b977d53-172d-4a66-8807-758a1e1abc45 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:47.354887391 +0000 UTC m=+10.166302258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret") pod "global-pull-secret-syncer-dq6ll" (UID: "7b977d53-172d-4a66-8807-758a1e1abc45") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:43.354999 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.354970 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:03:43.354999 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.354984 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:03:43.354999 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.354996 2567 projected.go:194] Error preparing data for projected volume kube-api-access-l4ksd for pod openshift-network-diagnostics/network-check-target-lzj47: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:43.355138 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.355027 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd podName:9b5d0119-32c6-4587-994b-0d70198060ea nodeName:}" failed. No retries permitted until 2026-04-17 20:03:47.355016904 +0000 UTC m=+10.166431782 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l4ksd" (UniqueName: "kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd") pod "network-check-target-lzj47" (UID: "9b5d0119-32c6-4587-994b-0d70198060ea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:43.797460 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:43.797427 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:43.797683 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.797643 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:43.797683 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:43.797457 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:43.797846 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.797754 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:43.797846 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:43.797427 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:43.797949 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:43.797904 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:45.798152 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:45.798117 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:45.798596 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:45.798252 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:45.798670 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:45.798641 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:45.798745 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:45.798727 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:45.798842 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:45.798805 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:45.798893 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:45.798875 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:47.393259 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:47.393215 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:47.393825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:47.393289 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:47.393825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:47.393330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:47.393954 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.393864 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:47.394000 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.393970 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret podName:7b977d53-172d-4a66-8807-758a1e1abc45 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:55.393946191 +0000 UTC m=+18.205361067 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret") pod "global-pull-secret-syncer-dq6ll" (UID: "7b977d53-172d-4a66-8807-758a1e1abc45") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:47.394134 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.394096 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:03:47.394134 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.394113 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:03:47.394134 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.394128 2567 projected.go:194] Error preparing data for projected volume kube-api-access-l4ksd for pod openshift-network-diagnostics/network-check-target-lzj47: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:47.394281 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.394182 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd podName:9b5d0119-32c6-4587-994b-0d70198060ea nodeName:}" failed. No retries permitted until 2026-04-17 20:03:55.394169413 +0000 UTC m=+18.205584283 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l4ksd" (UniqueName: "kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd") pod "network-check-target-lzj47" (UID: "9b5d0119-32c6-4587-994b-0d70198060ea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:47.394281 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.394272 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:47.394390 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.394318 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:03:55.394298269 +0000 UTC m=+18.205713135 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:47.798542 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:47.798502 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:47.798718 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.798616 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:47.799359 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:47.799061 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:47.799359 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.799170 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:47.799359 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:47.799251 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:47.799359 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:47.799309 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:49.797508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:49.797475 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:49.797992 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:49.797475 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:49.797992 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:49.797597 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:49.797992 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:49.797472 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:49.797992 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:49.797735 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:49.797992 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:49.797779 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:51.797887 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:51.797845 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:51.797887 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:51.797874 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:51.798405 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:51.797979 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:51.798405 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:51.798030 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:51.798405 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:51.798052 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:51.798405 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:51.798116 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:53.797186 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:53.797146 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:53.797654 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:53.797160 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:53.797654 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:53.797282 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:53.797654 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:53.797161 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:53.797654 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:53.797375 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:53.797654 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:53.797435 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:55.452236 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:55.452024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:55.452269 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452205 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:55.452302 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" 
(UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452324 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452344 2567 projected.go:194] Error preparing data for projected volume kube-api-access-l4ksd for pod openshift-network-diagnostics/network-check-target-lzj47: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452411 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd podName:9b5d0119-32c6-4587-994b-0d70198060ea nodeName:}" failed. No retries permitted until 2026-04-17 20:04:11.452397032 +0000 UTC m=+34.263811895 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l4ksd" (UniqueName: "kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd") pod "network-check-target-lzj47" (UID: "9b5d0119-32c6-4587-994b-0d70198060ea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452416 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452428 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452467 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:11.452451311 +0000 UTC m=+34.263866177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:03:55.452817 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.452484 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret podName:7b977d53-172d-4a66-8807-758a1e1abc45 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:11.452475051 +0000 UTC m=+34.263889917 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret") pod "global-pull-secret-syncer-dq6ll" (UID: "7b977d53-172d-4a66-8807-758a1e1abc45") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:03:55.797477 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:55.797437 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:55.797477 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:55.797467 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:55.797682 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:55.797438 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:55.797682 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.797574 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:55.797682 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.797648 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:55.797827 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:55.797752 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:57.800391 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:57.800356 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:57.800873 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:57.800362 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:57.800873 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:57.800473 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:57.800873 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:57.800362 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:57.800873 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:57.800606 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:57.800873 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:57.800714 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:58.896453 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.896204 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-78p7f" event={"ID":"3abe62da-aef2-4ef2-85a2-278e4f8fe4c1","Type":"ContainerStarted","Data":"df05760be400b9843ccdc28f54f75fd8d440657e81b6e711a7db9911fec18aa1"} Apr 17 20:03:58.897703 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.897673 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cgt9r" event={"ID":"0286d3b0-9e6f-498c-b50c-69d5149b3f0d","Type":"ContainerStarted","Data":"a4166ccfeea5e337bded8d0dba47491856445df2393157ded200701ed610fb06"} Apr 17 20:03:58.899241 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.899172 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4667d02-88e0-4ffd-a42f-77c06bdf9c21" containerID="578beb45b61a0410d2ee45458bb57145a8921d96069e58fa723440e3736376a6" exitCode=0 Apr 17 
20:03:58.899327 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.899272 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerDied","Data":"578beb45b61a0410d2ee45458bb57145a8921d96069e58fa723440e3736376a6"} Apr 17 20:03:58.900836 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.900808 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-648pj" event={"ID":"b9812077-3cec-42c4-91b6-506bbe029371","Type":"ContainerStarted","Data":"fc55df109d3d12969489b1d1e1fc0e4775263623a07d8b0e8cb8932755fa64c7"} Apr 17 20:03:58.904481 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.904455 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m59bq" event={"ID":"153e1d90-2d3e-4083-88f9-781771f16266","Type":"ContainerStarted","Data":"9e6bc63218ab84cad538151fb54deb33fa9b9fea9b0fb58f973ef4ab1b7a881c"} Apr 17 20:03:58.906233 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.906214 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"4bcd5dd237d6b89970902c7b3633f9f9f497d2d905c50f610f21da5b8a80766e"} Apr 17 20:03:58.906324 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.906241 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"3e28c659c7f33ca1ce75d783dd9053cce6db55f653d9b1e56117bb40a0cc7c65"} Apr 17 20:03:58.906324 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.906254 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" 
event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"25c596858c8d05fd4db9dccb112d4d0e411ff1512cc59a874f733fc161d2325f"} Apr 17 20:03:58.907450 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.907426 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6sb8k" event={"ID":"243940a2-412a-4019-b966-f66af5d78985","Type":"ContainerStarted","Data":"90125d35dfc1246f0bd7cb28dcc37f63156ca5a58dba8a14f4fe02e2f454c4c1"} Apr 17 20:03:58.908625 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.908607 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" event={"ID":"2281133e-34d1-4450-acb3-cbecb7262008","Type":"ContainerStarted","Data":"f09d62cda533751027996386adee351547f4a31f76c4c5e2015082d30d68599f"} Apr 17 20:03:58.909672 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.909638 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-158.ec2.internal" podStartSLOduration=20.909623497 podStartE2EDuration="20.909623497s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:03:42.880197405 +0000 UTC m=+5.691612293" watchObservedRunningTime="2026-04-17 20:03:58.909623497 +0000 UTC m=+21.721038383" Apr 17 20:03:58.909779 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.909700 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-78p7f" podStartSLOduration=4.437542549 podStartE2EDuration="21.909696905s" podCreationTimestamp="2026-04-17 20:03:37 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.530010842 +0000 UTC m=+3.341425705" lastFinishedPulling="2026-04-17 20:03:58.002165189 +0000 UTC m=+20.813580061" observedRunningTime="2026-04-17 20:03:58.90916815 +0000 UTC 
m=+21.720583036" watchObservedRunningTime="2026-04-17 20:03:58.909696905 +0000 UTC m=+21.721111789" Apr 17 20:03:58.922262 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.922216 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cgt9r" podStartSLOduration=3.516299809 podStartE2EDuration="20.9222026s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.563922591 +0000 UTC m=+3.375337454" lastFinishedPulling="2026-04-17 20:03:57.969825376 +0000 UTC m=+20.781240245" observedRunningTime="2026-04-17 20:03:58.921784694 +0000 UTC m=+21.733199579" watchObservedRunningTime="2026-04-17 20:03:58.9222026 +0000 UTC m=+21.733617485" Apr 17 20:03:58.953675 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.953615 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6sb8k" podStartSLOduration=9.200314878 podStartE2EDuration="21.95359486s" podCreationTimestamp="2026-04-17 20:03:37 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.532073318 +0000 UTC m=+3.343488185" lastFinishedPulling="2026-04-17 20:03:53.285353301 +0000 UTC m=+16.096768167" observedRunningTime="2026-04-17 20:03:58.95282132 +0000 UTC m=+21.764236206" watchObservedRunningTime="2026-04-17 20:03:58.95359486 +0000 UTC m=+21.765009751" Apr 17 20:03:58.968464 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.968421 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m59bq" podStartSLOduration=3.559838752 podStartE2EDuration="20.968407877s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.561425375 +0000 UTC m=+3.372840238" lastFinishedPulling="2026-04-17 20:03:57.969994486 +0000 UTC m=+20.781409363" observedRunningTime="2026-04-17 20:03:58.967876891 +0000 UTC m=+21.779291767" watchObservedRunningTime="2026-04-17 20:03:58.968407877 +0000 UTC m=+21.779822762" 
Apr 17 20:03:58.980849 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:58.980787 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-648pj" podStartSLOduration=4.547866616 podStartE2EDuration="21.980752226s" podCreationTimestamp="2026-04-17 20:03:37 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.538843004 +0000 UTC m=+3.350257867" lastFinishedPulling="2026-04-17 20:03:57.971728599 +0000 UTC m=+20.783143477" observedRunningTime="2026-04-17 20:03:58.980550081 +0000 UTC m=+21.791964978" watchObservedRunningTime="2026-04-17 20:03:58.980752226 +0000 UTC m=+21.792167110" Apr 17 20:03:59.647956 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.647720 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 20:03:59.773588 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.773425 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:03:59.647952229Z","UUID":"12e2c3ec-6455-48d2-ad60-433b6cf7086d","Handler":null,"Name":"","Endpoint":""} Apr 17 20:03:59.775269 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.775248 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 20:03:59.775412 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.775278 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 20:03:59.798078 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.798043 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:03:59.798258 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.798093 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:03:59.798258 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.798051 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:03:59.798258 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:59.798205 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:03:59.798422 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:59.798321 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:03:59.798422 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:03:59.798384 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:03:59.913393 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.913348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"b2ef1bc74908e5cb0a28f0512dc828854f8f076ab1802f15d369e746a9f113b5"} Apr 17 20:03:59.913393 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.913386 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"1ac869a1b1d2ed2dad05231a73b30ce199bbbbedde10d89638ab9d3814195a2d"} Apr 17 20:03:59.913393 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.913395 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"7c481648b6f3fcb9779fbf206dd59b2a33016edadc3c5551ab5298b874295015"} Apr 17 20:03:59.914889 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.914864 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" event={"ID":"2281133e-34d1-4450-acb3-cbecb7262008","Type":"ContainerStarted","Data":"93871e2ad7619f2a9797e88d59de7ba15aa39632bd3b06a8ceb6e703682e73f5"} Apr 17 20:03:59.916053 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.916027 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ft42n" event={"ID":"fca79bb6-cb8a-4910-8fd0-c7d340049d44","Type":"ContainerStarted","Data":"5d6bf28672db7240832dbfffffc4c163c562a73f47fcce489050a301584b58e0"} Apr 17 20:03:59.938078 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:03:59.938030 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-ft42n" podStartSLOduration=4.5317482810000005 podStartE2EDuration="21.938015033s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.563734549 +0000 UTC m=+3.375149417" lastFinishedPulling="2026-04-17 20:03:57.970001306 +0000 UTC m=+20.781416169" observedRunningTime="2026-04-17 20:03:59.937813492 +0000 UTC m=+22.749228376" watchObservedRunningTime="2026-04-17 20:03:59.938015033 +0000 UTC m=+22.749429918" Apr 17 20:04:00.294342 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:00.294302 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:04:00.295207 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:00.295178 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:04:00.920547 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:00.920462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" event={"ID":"2281133e-34d1-4450-acb3-cbecb7262008","Type":"ContainerStarted","Data":"a0ff2ad3aca357979e6720a3e13fc98175a9320a9add61de1f6837462c4d2c81"} Apr 17 20:04:00.921296 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:00.920829 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:04:00.921396 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:00.921378 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6sb8k" Apr 17 20:04:00.937422 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:00.937371 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kb8dg" podStartSLOduration=3.053127753 podStartE2EDuration="22.93735775s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" 
firstStartedPulling="2026-04-17 20:03:40.531193194 +0000 UTC m=+3.342608075" lastFinishedPulling="2026-04-17 20:04:00.415423205 +0000 UTC m=+23.226838072" observedRunningTime="2026-04-17 20:04:00.936943919 +0000 UTC m=+23.748358806" watchObservedRunningTime="2026-04-17 20:04:00.93735775 +0000 UTC m=+23.748772668" Apr 17 20:04:01.797939 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:01.797908 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:04:01.798117 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:01.797908 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:01.798117 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:01.798028 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:04:01.798117 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:01.797910 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:04:01.798278 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:01.798082 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:04:01.798278 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:01.798218 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:04:01.925287 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:01.925253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"500d6d2fa6b188411dc137327d6de2418ac118746edf057cda0248e72265bcc2"} Apr 17 20:04:03.797973 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:03.797936 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:03.798544 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:03.798006 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:04:03.798544 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:03.798078 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:04:03.798544 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:03.798147 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:04:03.798544 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:03.798196 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:04:03.798544 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:03.798303 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:04:03.933939 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:03.933511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" event={"ID":"9f323355-87cd-4d74-ba77-22d401a93474","Type":"ContainerStarted","Data":"7aac20e4c575693c4e501fc93cbd3e63a7d5ae0fc855cfdbdd5718a532776206"} Apr 17 20:04:03.933939 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:03.933801 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:04:03.948183 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:03.948155 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:04:03.962436 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:03.962381 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" podStartSLOduration=7.848046724 podStartE2EDuration="25.962365498s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.535848734 +0000 UTC m=+3.347263597" lastFinishedPulling="2026-04-17 20:03:58.650167508 +0000 UTC m=+21.461582371" observedRunningTime="2026-04-17 20:04:03.962037731 +0000 UTC m=+26.773452617" watchObservedRunningTime="2026-04-17 20:04:03.962365498 +0000 UTC m=+26.773780382" Apr 17 20:04:04.937383 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:04.937346 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4667d02-88e0-4ffd-a42f-77c06bdf9c21" containerID="16bb4fe01bc24626f13bb0ce29aaa3b087b49f05de309fa57bbc6921a4dc5568" exitCode=0 Apr 17 20:04:04.937933 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:04.937434 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" 
event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerDied","Data":"16bb4fe01bc24626f13bb0ce29aaa3b087b49f05de309fa57bbc6921a4dc5568"} Apr 17 20:04:04.937933 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:04.937637 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:04:04.937933 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:04.937657 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:04:04.952345 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:04.952321 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4" Apr 17 20:04:05.797788 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.797719 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:05.797985 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:05.797860 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:04:05.798286 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.798259 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:04:05.798463 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:05.798413 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:04:05.798463 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.798452 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:04:05.798597 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:05.798542 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:04:05.801214 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.801184 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dq6ll"] Apr 17 20:04:05.803871 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.803843 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6vrjk"] Apr 17 20:04:05.804344 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.804327 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lzj47"] Apr 17 20:04:05.940819 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.940788 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:04:05.941161 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.940788 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:04:05.941161 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:05.940788 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:05.941161 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:05.940915 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:04:05.941460 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:05.941428 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:04:05.941559 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:05.941492 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:04:06.944305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:06.944265 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4667d02-88e0-4ffd-a42f-77c06bdf9c21" containerID="d361b980c2a0d6fbce1f280b8526e1f5063fdd362997926ab5ac4233630cbf30" exitCode=0 Apr 17 20:04:06.944758 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:06.944354 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerDied","Data":"d361b980c2a0d6fbce1f280b8526e1f5063fdd362997926ab5ac4233630cbf30"} Apr 17 20:04:07.798336 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:07.798306 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:04:07.798336 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:07.798305 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:07.799404 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:07.798441 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:04:07.799404 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:07.798487 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:04:07.799404 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:07.798321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:04:07.799404 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:07.798587 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:04:08.951254 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:08.951218 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4667d02-88e0-4ffd-a42f-77c06bdf9c21" containerID="699794a5d4cd8116b0f89882e18ad97ec5a1235576e768292eec6297ced6fc6c" exitCode=0 Apr 17 20:04:08.952215 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:08.951283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerDied","Data":"699794a5d4cd8116b0f89882e18ad97ec5a1235576e768292eec6297ced6fc6c"} Apr 17 20:04:09.797872 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:09.797837 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:09.797872 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:09.797859 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:04:09.798120 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:09.797858 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:04:09.798120 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:09.797964 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lzj47" podUID="9b5d0119-32c6-4587-994b-0d70198060ea" Apr 17 20:04:09.798226 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:09.798165 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dq6ll" podUID="7b977d53-172d-4a66-8807-758a1e1abc45" Apr 17 20:04:09.798298 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:09.798277 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061" Apr 17 20:04:10.949083 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:10.948999 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-158.ec2.internal" event="NodeReady" Apr 17 20:04:10.949540 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:10.949156 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 20:04:10.981700 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:10.981667 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"] Apr 17 20:04:10.999784 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:10.999724 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4"] Apr 17 20:04:10.999950 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:10.999899 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.003567 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.003540 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 20:04:11.006625 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.006594 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 20:04:11.006788 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.006651 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 20:04:11.006788 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.006696 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 20:04:11.018378 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.018329 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk"] Apr 17 20:04:11.018592 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.018565 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.021150 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.021122 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 20:04:11.021903 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.021877 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 20:04:11.022026 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.021989 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 20:04:11.022026 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.021989 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 20:04:11.030024 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.029999 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57566cfc79-8bxjf"] Apr 17 20:04:11.030204 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.030185 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.032521 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.032497 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-6sz22\"" Apr 17 20:04:11.032640 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.032608 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 20:04:11.046649 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.046584 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zmjhn"] Apr 17 20:04:11.046841 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.046752 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.049175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.049154 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 20:04:11.049432 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.049159 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 20:04:11.049432 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.049202 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 20:04:11.049594 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.049159 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rkflr\"" Apr 17 20:04:11.053814 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.053790 2567 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 20:04:11.061707 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.061681 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"] Apr 17 20:04:11.061853 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.061722 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-755zw"] Apr 17 20:04:11.061853 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.061758 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zmjhn" Apr 17 20:04:11.064445 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.064414 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tl4j2\"" Apr 17 20:04:11.064445 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.064433 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 20:04:11.064646 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.064625 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 20:04:11.064798 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.064690 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 20:04:11.073132 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.073093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/206e65d1-c12e-41a7-88b3-b06e61138b95-klusterlet-config\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.073327 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.073157 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/206e65d1-c12e-41a7-88b3-b06e61138b95-tmp\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.073327 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.073185 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87v9\" (UniqueName: \"kubernetes.io/projected/206e65d1-c12e-41a7-88b3-b06e61138b95-kube-api-access-d87v9\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.078072 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.078050 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk"] Apr 17 20:04:11.078072 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.078073 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4"] Apr 17 20:04:11.078215 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.078081 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57566cfc79-8bxjf"] Apr 17 20:04:11.078215 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.078089 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-755zw"] Apr 17 20:04:11.078215 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.078096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-zmjhn"] Apr 17 20:04:11.078215 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.078200 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.080371 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.080351 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 20:04:11.080577 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.080560 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4cbzc\"" Apr 17 20:04:11.080577 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.080574 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 20:04:11.174246 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174197 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/206e65d1-c12e-41a7-88b3-b06e61138b95-klusterlet-config\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.174435 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174350 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ff6857d-6533-480b-ba95-f01666563ed0-config-volume\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.174435 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174383 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.174435 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174420 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.174583 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174448 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-ca\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.174583 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.174583 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-image-registry-private-configuration\") pod \"image-registry-57566cfc79-8bxjf\" (UID: 
\"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.174728 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174586 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.174728 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174674 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/206e65d1-c12e-41a7-88b3-b06e61138b95-tmp\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.174870 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174736 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d87v9\" (UniqueName: \"kubernetes.io/projected/206e65d1-c12e-41a7-88b3-b06e61138b95-kube-api-access-d87v9\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.174870 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174829 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ff6857d-6533-480b-ba95-f01666563ed0-tmp-dir\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.174870 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174862 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7wm\" (UniqueName: \"kubernetes.io/projected/4ff6857d-6533-480b-ba95-f01666563ed0-kube-api-access-zm7wm\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.175003 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-999xq\" (UniqueName: \"kubernetes.io/projected/42d25788-0fa0-415b-814b-07fd37617909-kube-api-access-999xq\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.175070 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.174997 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdwm\" (UniqueName: \"kubernetes.io/projected/e856013c-9b3d-4e4a-9498-4fcefde1b527-kube-api-access-8sdwm\") pod \"managed-serviceaccount-addon-agent-c855cb67f-66tfk\" (UID: \"e856013c-9b3d-4e4a-9498-4fcefde1b527\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.175070 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-ca-trust-extracted\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.175188 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-bound-sa-token\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.175188 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175117 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn" Apr 17 20:04:11.175188 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175144 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz54m\" (UniqueName: \"kubernetes.io/projected/84ab49f7-96ea-40fb-b996-8b5492b23d01-kube-api-access-lz54m\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn" Apr 17 20:04:11.175188 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/206e65d1-c12e-41a7-88b3-b06e61138b95-tmp\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.175334 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175187 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-certificates\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.175334 
ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175226 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-installation-pull-secrets\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.175334 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175266 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-trusted-ca\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.175334 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxql\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-kube-api-access-7sxql\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.175334 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175318 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-hub\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.175533 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175335 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" 
(UniqueName: \"kubernetes.io/configmap/42d25788-0fa0-415b-814b-07fd37617909-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.175533 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.175378 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e856013c-9b3d-4e4a-9498-4fcefde1b527-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c855cb67f-66tfk\" (UID: \"e856013c-9b3d-4e4a-9498-4fcefde1b527\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.179389 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.179365 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/206e65d1-c12e-41a7-88b3-b06e61138b95-klusterlet-config\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.186073 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.186047 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87v9\" (UniqueName: \"kubernetes.io/projected/206e65d1-c12e-41a7-88b3-b06e61138b95-kube-api-access-d87v9\") pod \"klusterlet-addon-workmgr-68ddbbc475-ntvzv\" (UID: \"206e65d1-c12e-41a7-88b3-b06e61138b95\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.276095 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276053 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ff6857d-6533-480b-ba95-f01666563ed0-config-volume\") pod \"dns-default-755zw\" (UID: 
\"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.276095 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.276350 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.276350 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276297 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-ca\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.276350 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.276508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276360 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-image-registry-private-configuration\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.276508 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.276375 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:11.276508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276389 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.276508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ff6857d-6533-480b-ba95-f01666563ed0-tmp-dir\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.276508 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.276463 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:11.776440661 +0000 UTC m=+34.587855529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found Apr 17 20:04:11.276508 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276502 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7wm\" (UniqueName: \"kubernetes.io/projected/4ff6857d-6533-480b-ba95-f01666563ed0-kube-api-access-zm7wm\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276534 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-999xq\" (UniqueName: \"kubernetes.io/projected/42d25788-0fa0-415b-814b-07fd37617909-kube-api-access-999xq\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sdwm\" (UniqueName: \"kubernetes.io/projected/e856013c-9b3d-4e4a-9498-4fcefde1b527-kube-api-access-8sdwm\") pod \"managed-serviceaccount-addon-agent-c855cb67f-66tfk\" (UID: \"e856013c-9b3d-4e4a-9498-4fcefde1b527\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-ca-trust-extracted\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") 
" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276626 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-bound-sa-token\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276691 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz54m\" (UniqueName: \"kubernetes.io/projected/84ab49f7-96ea-40fb-b996-8b5492b23d01-kube-api-access-lz54m\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276698 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ff6857d-6533-480b-ba95-f01666563ed0-tmp-dir\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276710 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-certificates\") pod \"image-registry-57566cfc79-8bxjf\" 
(UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276729 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-installation-pull-secrets\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.276825 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276803 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-trusted-ca\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.277302 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276836 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxql\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-kube-api-access-7sxql\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.277302 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276834 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ff6857d-6533-480b-ba95-f01666563ed0-config-volume\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.277302 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276854 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-hub\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.277302 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/42d25788-0fa0-415b-814b-07fd37617909-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.277302 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.276966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e856013c-9b3d-4e4a-9498-4fcefde1b527-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c855cb67f-66tfk\" (UID: \"e856013c-9b3d-4e4a-9498-4fcefde1b527\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.277302 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.277240 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-ca-trust-extracted\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.277800 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.277747 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/42d25788-0fa0-415b-814b-07fd37617909-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.278477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-certificates\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.278793 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.278859 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:11.778840423 +0000 UTC m=+34.590255288 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.278908 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-trusted-ca\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.279225 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.279276 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.279293 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found Apr 17 20:04:11.279362 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.279337 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:11.779321145 +0000 UTC m=+34.590736028 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found Apr 17 20:04:11.280172 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.280147 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e856013c-9b3d-4e4a-9498-4fcefde1b527-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-c855cb67f-66tfk\" (UID: \"e856013c-9b3d-4e4a-9498-4fcefde1b527\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.280318 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.280301 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-image-registry-private-configuration\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.281452 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.281430 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-installation-pull-secrets\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.286949 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.286316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-ca\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.286949 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.286885 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sdwm\" (UniqueName: \"kubernetes.io/projected/e856013c-9b3d-4e4a-9498-4fcefde1b527-kube-api-access-8sdwm\") pod \"managed-serviceaccount-addon-agent-c855cb67f-66tfk\" (UID: \"e856013c-9b3d-4e4a-9498-4fcefde1b527\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.287128 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.287090 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.287578 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.287533 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/42d25788-0fa0-415b-814b-07fd37617909-hub\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.287578 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.287566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7wm\" (UniqueName: \"kubernetes.io/projected/4ff6857d-6533-480b-ba95-f01666563ed0-kube-api-access-zm7wm\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:04:11.288744 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.288659 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7sxql\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-kube-api-access-7sxql\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.288744 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.288702 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz54m\" (UniqueName: \"kubernetes.io/projected/84ab49f7-96ea-40fb-b996-8b5492b23d01-kube-api-access-lz54m\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn" Apr 17 20:04:11.289574 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.289550 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-bound-sa-token\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:04:11.289723 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.289708 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-999xq\" (UniqueName: \"kubernetes.io/projected/42d25788-0fa0-415b-814b-07fd37617909-kube-api-access-999xq\") pod \"cluster-proxy-proxy-agent-f87669895-79jr4\" (UID: \"42d25788-0fa0-415b-814b-07fd37617909\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.312759 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.312727 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" Apr 17 20:04:11.344657 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.344623 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:04:11.357149 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.357127 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" Apr 17 20:04:11.479734 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.479445 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"] Apr 17 20:04:11.479992 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.479967 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:11.480102 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.480057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll" Apr 17 20:04:11.480102 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.480096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480247 2567 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480348 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480370 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480379 2567 projected.go:194] Error preparing data for projected volume kube-api-access-l4ksd for pod openshift-network-diagnostics/network-check-target-lzj47: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480425 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd podName:9b5d0119-32c6-4587-994b-0d70198060ea nodeName:}" failed. No retries permitted until 2026-04-17 20:04:43.48040827 +0000 UTC m=+66.291823147 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l4ksd" (UniqueName: "kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd") pod "network-check-target-lzj47" (UID: "9b5d0119-32c6-4587-994b-0d70198060ea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480507 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:43.480486312 +0000 UTC m=+66.291901192 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480596 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:11.480739 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.480680 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret podName:7b977d53-172d-4a66-8807-758a1e1abc45 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:43.480667464 +0000 UTC m=+66.292082345 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret") pod "global-pull-secret-syncer-dq6ll" (UID: "7b977d53-172d-4a66-8807-758a1e1abc45") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:04:11.485814 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:04:11.485781 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod206e65d1_c12e_41a7_88b3_b06e61138b95.slice/crio-021138e54014b8edb92ce82243147ee5b95b9ad2a29fb0e5ba7af04de2fee5f8 WatchSource:0}: Error finding container 021138e54014b8edb92ce82243147ee5b95b9ad2a29fb0e5ba7af04de2fee5f8: Status 404 returned error can't find the container with id 021138e54014b8edb92ce82243147ee5b95b9ad2a29fb0e5ba7af04de2fee5f8
Apr 17 20:04:11.500802 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.500748 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk"]
Apr 17 20:04:11.503816 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:04:11.503788 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode856013c_9b3d_4e4a_9498_4fcefde1b527.slice/crio-c7f59c6ac12bdb9d7967821798e7a06e76ae7c6e17ceb3dcbf60a771db187397 WatchSource:0}: Error finding container c7f59c6ac12bdb9d7967821798e7a06e76ae7c6e17ceb3dcbf60a771db187397: Status 404 returned error can't find the container with id c7f59c6ac12bdb9d7967821798e7a06e76ae7c6e17ceb3dcbf60a771db187397
Apr 17 20:04:11.508374 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.508346 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4"]
Apr 17 20:04:11.519645 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:04:11.519609 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42d25788_0fa0_415b_814b_07fd37617909.slice/crio-f99a86a959db9f2a71af17335f2f39ca234b78aee98815c3cfc82bcb7d7e7f27 WatchSource:0}: Error finding container f99a86a959db9f2a71af17335f2f39ca234b78aee98815c3cfc82bcb7d7e7f27: Status 404 returned error can't find the container with id f99a86a959db9f2a71af17335f2f39ca234b78aee98815c3cfc82bcb7d7e7f27
Apr 17 20:04:11.783000 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.782961 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:04:11.783000 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.783001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.783044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.783126 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.783133 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.783174 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:12.783160514 +0000 UTC m=+35.594575377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.783203 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:12.783184835 +0000 UTC m=+35.594599920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.783132 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.783222 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found
Apr 17 20:04:11.783297 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:11.783261 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:12.783249088 +0000 UTC m=+35.594663955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found
Apr 17 20:04:11.797191 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.797152 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll"
Apr 17 20:04:11.797320 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.797258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47"
Apr 17 20:04:11.797320 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.797290 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk"
Apr 17 20:04:11.799838 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.799734 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 20:04:11.799838 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.799740 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8c6pz\""
Apr 17 20:04:11.799838 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.799788 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x8jh2\""
Apr 17 20:04:11.799838 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.799818 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 20:04:11.800155 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.799745 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 20:04:11.800155 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.799793 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 20:04:11.958681 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.958640 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" event={"ID":"42d25788-0fa0-415b-814b-07fd37617909","Type":"ContainerStarted","Data":"f99a86a959db9f2a71af17335f2f39ca234b78aee98815c3cfc82bcb7d7e7f27"}
Apr 17 20:04:11.960182 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.960151 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" event={"ID":"206e65d1-c12e-41a7-88b3-b06e61138b95","Type":"ContainerStarted","Data":"021138e54014b8edb92ce82243147ee5b95b9ad2a29fb0e5ba7af04de2fee5f8"}
Apr 17 20:04:11.961273 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:11.961247 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" event={"ID":"e856013c-9b3d-4e4a-9498-4fcefde1b527","Type":"ContainerStarted","Data":"c7f59c6ac12bdb9d7967821798e7a06e76ae7c6e17ceb3dcbf60a771db187397"}
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:12.792175 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:12.792230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:12.792293 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:12.792420 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:12.792486 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:14.79246491 +0000 UTC m=+37.603879795 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:12.792922 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:12.792978 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:14.79296206 +0000 UTC m=+37.604376936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:12.792984 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:04:12.793005 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:12.793002 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found
Apr 17 20:04:12.793714 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:12.793046 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:14.793033366 +0000 UTC m=+37.604448243 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found
Apr 17 20:04:14.811717 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:14.811674 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:04:14.812312 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:14.811729 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:04:14.812312 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:14.811813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:04:14.812312 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:14.811955 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:04:14.812312 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:14.812019 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:18.811999418 +0000 UTC m=+41.623414299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found
Apr 17 20:04:14.812523 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:14.812438 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:04:14.812523 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:14.812488 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:18.812472622 +0000 UTC m=+41.623887492 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found
Apr 17 20:04:14.812629 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:14.812548 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:04:14.812629 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:14.812559 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found
Apr 17 20:04:14.812629 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:14.812589 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:18.812578945 +0000 UTC m=+41.623993810 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found
Apr 17 20:04:18.852571 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:18.852522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:18.852582 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:18.852654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:18.852691 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:18.852711 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:18.852745 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:18.852692 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:18.852798 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:26.852755225 +0000 UTC m=+49.664170108 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:18.852816 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:26.852807136 +0000 UTC m=+49.664222010 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found
Apr 17 20:04:18.853067 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:18.852831 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:26.852823058 +0000 UTC m=+49.664237927 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found
Apr 17 20:04:20.981497 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.981464 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" event={"ID":"42d25788-0fa0-415b-814b-07fd37617909","Type":"ContainerStarted","Data":"57128e5fff91b78692c27196dad01fd4e9c86abc0ced4462e87794ed41a1c03c"}
Apr 17 20:04:20.982705 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.982680 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" event={"ID":"206e65d1-c12e-41a7-88b3-b06e61138b95","Type":"ContainerStarted","Data":"b055d92c884ff3bb655648180c681d9ad08e819db44774a96b303de40aabca35"}
Apr 17 20:04:20.982893 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.982879 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"
Apr 17 20:04:20.984708 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.984686 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"
Apr 17 20:04:20.985369 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.985347 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4667d02-88e0-4ffd-a42f-77c06bdf9c21" containerID="b0b8e82d35b868fd894032c1957ed2c83c6ae95372439e36faf2201a8f66838b" exitCode=0
Apr 17 20:04:20.985469 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.985402 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerDied","Data":"b0b8e82d35b868fd894032c1957ed2c83c6ae95372439e36faf2201a8f66838b"}
Apr 17 20:04:20.986962 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.986941 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" event={"ID":"e856013c-9b3d-4e4a-9498-4fcefde1b527","Type":"ContainerStarted","Data":"c900d7a0f8dde4057a9f125c098b52c9d3bc73e7cb0ee8ed6d8c1eb8cae60d5e"}
Apr 17 20:04:20.999259 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:20.999211 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" podStartSLOduration=19.879239341999998 podStartE2EDuration="28.999199019s" podCreationTimestamp="2026-04-17 20:03:52 +0000 UTC" firstStartedPulling="2026-04-17 20:04:11.488063072 +0000 UTC m=+34.299477942" lastFinishedPulling="2026-04-17 20:04:20.608022756 +0000 UTC m=+43.419437619" observedRunningTime="2026-04-17 20:04:20.998431247 +0000 UTC m=+43.809846132" watchObservedRunningTime="2026-04-17 20:04:20.999199019 +0000 UTC m=+43.810613904"
Apr 17 20:04:21.049959 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:21.049900 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" podStartSLOduration=19.963454365 podStartE2EDuration="29.049879883s" podCreationTimestamp="2026-04-17 20:03:52 +0000 UTC" firstStartedPulling="2026-04-17 20:04:11.505892607 +0000 UTC m=+34.317307473" lastFinishedPulling="2026-04-17 20:04:20.592318124 +0000 UTC m=+43.403732991" observedRunningTime="2026-04-17 20:04:21.049472793 +0000 UTC m=+43.860887679" watchObservedRunningTime="2026-04-17 20:04:21.049879883 +0000 UTC m=+43.861294768"
Apr 17 20:04:21.994347 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:21.994310 2567 generic.go:358] "Generic (PLEG): container finished" podID="a4667d02-88e0-4ffd-a42f-77c06bdf9c21" containerID="0c1992ae3be25b596c1bf034f4a8f45bbb055e789c26c1223d10b31de4b6c873" exitCode=0
Apr 17 20:04:21.994787 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:21.994392 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerDied","Data":"0c1992ae3be25b596c1bf034f4a8f45bbb055e789c26c1223d10b31de4b6c873"}
Apr 17 20:04:22.999647 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:22.999613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" event={"ID":"a4667d02-88e0-4ffd-a42f-77c06bdf9c21","Type":"ContainerStarted","Data":"dc71903f61563ceb046b93e5c01aacf7453b95762e83d3d34bcf043b3597d68e"}
Apr 17 20:04:23.022184 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:23.021961 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q4p8s" podStartSLOduration=4.99231919 podStartE2EDuration="45.021941047s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="2026-04-17 20:03:40.563844549 +0000 UTC m=+3.375259421" lastFinishedPulling="2026-04-17 20:04:20.5934664 +0000 UTC m=+43.404881278" observedRunningTime="2026-04-17 20:04:23.021202649 +0000 UTC m=+45.832617544" watchObservedRunningTime="2026-04-17 20:04:23.021941047 +0000 UTC m=+45.833355933"
Apr 17 20:04:25.005568 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:25.005531 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" event={"ID":"42d25788-0fa0-415b-814b-07fd37617909","Type":"ContainerStarted","Data":"4888e72356e028c1b39a279499977922dfb0a2773c7812dda9b26b1a1bbb17af"}
Apr 17 20:04:25.005568 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:25.005568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" event={"ID":"42d25788-0fa0-415b-814b-07fd37617909","Type":"ContainerStarted","Data":"e0ed278e053596e2c81afae732b2e00ca83a05154569f47ceb701aa15de38235"}
Apr 17 20:04:25.024830 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:25.024758 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" podStartSLOduration=20.589024555 podStartE2EDuration="33.0247447s" podCreationTimestamp="2026-04-17 20:03:52 +0000 UTC" firstStartedPulling="2026-04-17 20:04:11.521956982 +0000 UTC m=+34.333371845" lastFinishedPulling="2026-04-17 20:04:23.957677123 +0000 UTC m=+46.769091990" observedRunningTime="2026-04-17 20:04:25.024310082 +0000 UTC m=+47.835724967" watchObservedRunningTime="2026-04-17 20:04:25.0247447 +0000 UTC m=+47.836159585"
Apr 17 20:04:26.917329 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:26.917281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:04:26.917329 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:26.917332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:26.917382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:26.917454 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:26.917474 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:26.917496 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:26.917514 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:26.917539 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:42.917518458 +0000 UTC m=+65.728933321 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:26.917560 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:42.917548285 +0000 UTC m=+65.728963152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found
Apr 17 20:04:26.917988 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:26.917575 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:42.917568239 +0000 UTC m=+65.728983101 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found
Apr 17 20:04:36.955589 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:36.955560 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ml4d4"
Apr 17 20:04:42.945900 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:42.945861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:42.945923 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:42.945945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:42.946025 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:42.946033 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:42.946101 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:14.946087315 +0000 UTC m=+97.757502178 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:42.946036 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:42.946035 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:42.946205 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:14.94618804 +0000 UTC m=+97.757602933 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found
Apr 17 20:04:42.946291 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:42.946227 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:14.946214755 +0000 UTC m=+97.757629618 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found
Apr 17 20:04:43.550002 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.549966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47"
Apr 17 20:04:43.550184 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.550026 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll"
Apr 17 20:04:43.550184 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.550049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk"
Apr 17 20:04:43.552793 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.552755 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 20:04:43.552938 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.552921 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 20:04:43.552984 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.552971 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 20:04:43.561062 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:43.561041 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 20:04:43.561143 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:04:43.561108 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:47.561088723 +0000 UTC m=+130.372503892 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : secret "metrics-daemon-secret" not found
Apr 17 20:04:43.563359 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.563342 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 20:04:43.564305 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.564287 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b977d53-172d-4a66-8807-758a1e1abc45-original-pull-secret\") pod \"global-pull-secret-syncer-dq6ll\" (UID: \"7b977d53-172d-4a66-8807-758a1e1abc45\") " pod="kube-system/global-pull-secret-syncer-dq6ll"
Apr 17 20:04:43.574110 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.574086 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4ksd\" (UniqueName: \"kubernetes.io/projected/9b5d0119-32c6-4587-994b-0d70198060ea-kube-api-access-l4ksd\") pod \"network-check-target-lzj47\" (UID: \"9b5d0119-32c6-4587-994b-0d70198060ea\") " pod="openshift-network-diagnostics/network-check-target-lzj47"
Apr 17 20:04:43.608999 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.608972 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dq6ll"
Apr 17 20:04:43.617527 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.617505 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8c6pz\""
Apr 17 20:04:43.625346 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.625323 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:43.735435 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.735381 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dq6ll"] Apr 17 20:04:43.738492 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:04:43.738461 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b977d53_172d_4a66_8807_758a1e1abc45.slice/crio-84cd22a4fe52c157d05025be272c62d3f49075d2a80a13f40ed000b2a6753283 WatchSource:0}: Error finding container 84cd22a4fe52c157d05025be272c62d3f49075d2a80a13f40ed000b2a6753283: Status 404 returned error can't find the container with id 84cd22a4fe52c157d05025be272c62d3f49075d2a80a13f40ed000b2a6753283 Apr 17 20:04:43.748402 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:43.748378 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lzj47"] Apr 17 20:04:43.751529 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:04:43.751507 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b5d0119_32c6_4587_994b_0d70198060ea.slice/crio-aaba1bf83df241ae9d354feae86a49cd7212a18028282e8d5c9267b9af5d323e WatchSource:0}: Error finding container aaba1bf83df241ae9d354feae86a49cd7212a18028282e8d5c9267b9af5d323e: Status 404 returned error can't find the container with id aaba1bf83df241ae9d354feae86a49cd7212a18028282e8d5c9267b9af5d323e Apr 17 20:04:44.042036 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:44.042002 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lzj47" event={"ID":"9b5d0119-32c6-4587-994b-0d70198060ea","Type":"ContainerStarted","Data":"aaba1bf83df241ae9d354feae86a49cd7212a18028282e8d5c9267b9af5d323e"} Apr 17 20:04:44.042952 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:04:44.042928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dq6ll" event={"ID":"7b977d53-172d-4a66-8807-758a1e1abc45","Type":"ContainerStarted","Data":"84cd22a4fe52c157d05025be272c62d3f49075d2a80a13f40ed000b2a6753283"} Apr 17 20:04:47.053632 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:47.053589 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lzj47" event={"ID":"9b5d0119-32c6-4587-994b-0d70198060ea","Type":"ContainerStarted","Data":"8d41cce3c8cf2a820c7cf23652ca791b8e628be339c9e7e3cc8f36a77ee2be13"} Apr 17 20:04:47.054097 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:47.053779 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:04:47.070050 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:47.070004 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lzj47" podStartSLOduration=67.257388576 podStartE2EDuration="1m10.069992446s" podCreationTimestamp="2026-04-17 20:03:37 +0000 UTC" firstStartedPulling="2026-04-17 20:04:43.753368028 +0000 UTC m=+66.564782897" lastFinishedPulling="2026-04-17 20:04:46.565971886 +0000 UTC m=+69.377386767" observedRunningTime="2026-04-17 20:04:47.06865125 +0000 UTC m=+69.880066138" watchObservedRunningTime="2026-04-17 20:04:47.069992446 +0000 UTC m=+69.881407375" Apr 17 20:04:49.061345 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:49.061307 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dq6ll" event={"ID":"7b977d53-172d-4a66-8807-758a1e1abc45","Type":"ContainerStarted","Data":"1e5de3429f019426d42bfc97e055675a3c1ab0cb320aa4c5dee87e8ecff5f3b6"} Apr 17 20:04:49.075560 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:04:49.075515 2567 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dq6ll" podStartSLOduration=66.325034889 podStartE2EDuration="1m11.075501366s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="2026-04-17 20:04:43.740230005 +0000 UTC m=+66.551644878" lastFinishedPulling="2026-04-17 20:04:48.490696475 +0000 UTC m=+71.302111355" observedRunningTime="2026-04-17 20:04:49.074608833 +0000 UTC m=+71.886023717" watchObservedRunningTime="2026-04-17 20:04:49.075501366 +0000 UTC m=+71.886916251" Apr 17 20:05:14.982759 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:14.982725 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw" Apr 17 20:05:14.982759 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:14.982775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:14.982826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn" Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:14.982877 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:14.982924 
2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:14.982936 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:14.982955 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57566cfc79-8bxjf: secret "image-registry-tls" not found Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:14.982957 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls podName:4ff6857d-6533-480b-ba95-f01666563ed0 nodeName:}" failed. No retries permitted until 2026-04-17 20:06:18.982936095 +0000 UTC m=+161.794350972 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls") pod "dns-default-755zw" (UID: "4ff6857d-6533-480b-ba95-f01666563ed0") : secret "dns-default-metrics-tls" not found Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:14.983027 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert podName:84ab49f7-96ea-40fb-b996-8b5492b23d01 nodeName:}" failed. No retries permitted until 2026-04-17 20:06:18.983007608 +0000 UTC m=+161.794422475 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert") pod "ingress-canary-zmjhn" (UID: "84ab49f7-96ea-40fb-b996-8b5492b23d01") : secret "canary-serving-cert" not found Apr 17 20:05:14.983175 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:14.983040 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls podName:2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4 nodeName:}" failed. No retries permitted until 2026-04-17 20:06:18.983033963 +0000 UTC m=+161.794448829 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls") pod "image-registry-57566cfc79-8bxjf" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4") : secret "image-registry-tls" not found Apr 17 20:05:18.060131 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:18.060103 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lzj47" Apr 17 20:05:34.322789 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:34.322740 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m59bq_153e1d90-2d3e-4083-88f9-781771f16266/dns-node-resolver/0.log" Apr 17 20:05:34.923418 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:34.923390 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cgt9r_0286d3b0-9e6f-498c-b50c-69d5149b3f0d/node-ca/0.log" Apr 17 20:05:45.541818 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.541784 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-x5wz5"] Apr 17 20:05:45.544709 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.544689 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.546994 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.546970 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:05:45.547107 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.547009 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 20:05:45.547329 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.547314 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 20:05:45.548048 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.548032 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:05:45.548048 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.548040 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zcfv7\"" Apr 17 20:05:45.554725 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.554705 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x5wz5"] Apr 17 20:05:45.608306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.608267 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/22c87446-981f-4f9a-8661-7b204afd155c-data-volume\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.608306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.608309 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/22c87446-981f-4f9a-8661-7b204afd155c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.608527 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.608336 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbzp4\" (UniqueName: \"kubernetes.io/projected/22c87446-981f-4f9a-8661-7b204afd155c-kube-api-access-rbzp4\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.608527 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.608403 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/22c87446-981f-4f9a-8661-7b204afd155c-crio-socket\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.608527 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.608464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.709241 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.709190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/22c87446-981f-4f9a-8661-7b204afd155c-data-volume\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " 
pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.709241 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.709243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/22c87446-981f-4f9a-8661-7b204afd155c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.709467 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.709271 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbzp4\" (UniqueName: \"kubernetes.io/projected/22c87446-981f-4f9a-8661-7b204afd155c-kube-api-access-rbzp4\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.709467 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.709302 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/22c87446-981f-4f9a-8661-7b204afd155c-crio-socket\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.709467 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.709327 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.709467 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.709419 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/22c87446-981f-4f9a-8661-7b204afd155c-crio-socket\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.709467 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:45.709448 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 20:05:45.709647 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:45.709517 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls podName:22c87446-981f-4f9a-8661-7b204afd155c nodeName:}" failed. No retries permitted until 2026-04-17 20:05:46.209498847 +0000 UTC m=+129.020913710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-x5wz5" (UID: "22c87446-981f-4f9a-8661-7b204afd155c") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:45.709820 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.709803 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/22c87446-981f-4f9a-8661-7b204afd155c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.710153 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.710137 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/22c87446-981f-4f9a-8661-7b204afd155c-data-volume\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " 
pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:45.717682 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:45.717657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbzp4\" (UniqueName: \"kubernetes.io/projected/22c87446-981f-4f9a-8661-7b204afd155c-kube-api-access-rbzp4\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:46.212999 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:46.212959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:46.213181 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:46.213117 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 20:05:46.213222 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:46.213191 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls podName:22c87446-981f-4f9a-8661-7b204afd155c nodeName:}" failed. No retries permitted until 2026-04-17 20:05:47.213172423 +0000 UTC m=+130.024587307 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-x5wz5" (UID: "22c87446-981f-4f9a-8661-7b204afd155c") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:47.219635 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:47.219598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:47.220061 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:47.219789 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 20:05:47.220061 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:47.219859 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls podName:22c87446-981f-4f9a-8661-7b204afd155c nodeName:}" failed. No retries permitted until 2026-04-17 20:05:49.219844319 +0000 UTC m=+132.031259186 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-x5wz5" (UID: "22c87446-981f-4f9a-8661-7b204afd155c") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:47.622334 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:47.622301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:05:47.622513 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:47.622446 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:05:47.622513 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:47.622511 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs podName:3ab88728-120f-4d07-91b8-97fe1307e061 nodeName:}" failed. No retries permitted until 2026-04-17 20:07:49.622495069 +0000 UTC m=+252.433909932 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs") pod "network-metrics-daemon-6vrjk" (UID: "3ab88728-120f-4d07-91b8-97fe1307e061") : secret "metrics-daemon-secret" not found Apr 17 20:05:49.235734 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:49.235692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:49.236139 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:49.235855 2567 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 20:05:49.236139 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:05:49.235920 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls podName:22c87446-981f-4f9a-8661-7b204afd155c nodeName:}" failed. No retries permitted until 2026-04-17 20:05:53.235904627 +0000 UTC m=+136.047319495 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls") pod "insights-runtime-extractor-x5wz5" (UID: "22c87446-981f-4f9a-8661-7b204afd155c") : secret "insights-runtime-extractor-tls" not found Apr 17 20:05:53.269267 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:53.269238 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:53.271557 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:53.271538 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/22c87446-981f-4f9a-8661-7b204afd155c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x5wz5\" (UID: \"22c87446-981f-4f9a-8661-7b204afd155c\") " pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:53.353550 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:53.353525 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x5wz5" Apr 17 20:05:53.467835 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:53.467795 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x5wz5"] Apr 17 20:05:53.470514 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:05:53.470488 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22c87446_981f_4f9a_8661_7b204afd155c.slice/crio-038023cd08d8db34a03b20151c37b1886a42522a2f9ecd955a0837aaeec2c59a WatchSource:0}: Error finding container 038023cd08d8db34a03b20151c37b1886a42522a2f9ecd955a0837aaeec2c59a: Status 404 returned error can't find the container with id 038023cd08d8db34a03b20151c37b1886a42522a2f9ecd955a0837aaeec2c59a Apr 17 20:05:54.217150 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:54.217125 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x5wz5" event={"ID":"22c87446-981f-4f9a-8661-7b204afd155c","Type":"ContainerStarted","Data":"993384a7f469af4ed73395d2adf0e1ae391c531523d1165550659440d62d1285"} Apr 17 20:05:54.217248 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:54.217159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x5wz5" event={"ID":"22c87446-981f-4f9a-8661-7b204afd155c","Type":"ContainerStarted","Data":"038023cd08d8db34a03b20151c37b1886a42522a2f9ecd955a0837aaeec2c59a"} Apr 17 20:05:55.221304 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:55.221268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x5wz5" event={"ID":"22c87446-981f-4f9a-8661-7b204afd155c","Type":"ContainerStarted","Data":"656a099d22bd423722cf258dd6422b92eab57deec71b5e36408465bac778b40f"} Apr 17 20:05:56.227191 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:05:56.227148 2567 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-x5wz5" event={"ID":"22c87446-981f-4f9a-8661-7b204afd155c","Type":"ContainerStarted","Data":"a22cee3ff1fb368a73257e3edd81ada5ffa6d018046f72d4f1551cc85ee41e70"}
Apr 17 20:06:12.009719 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.009668 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-x5wz5" podStartSLOduration=24.901518325 podStartE2EDuration="27.009653963s" podCreationTimestamp="2026-04-17 20:05:45 +0000 UTC" firstStartedPulling="2026-04-17 20:05:53.522918462 +0000 UTC m=+136.334333324" lastFinishedPulling="2026-04-17 20:05:55.631054093 +0000 UTC m=+138.442468962" observedRunningTime="2026-04-17 20:05:56.243119235 +0000 UTC m=+139.054534119" watchObservedRunningTime="2026-04-17 20:06:12.009653963 +0000 UTC m=+154.821068848"
Apr 17 20:06:12.010197 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.009863 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k27c8"]
Apr 17 20:06:12.013154 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.013131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.015287 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.015263 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 20:06:12.015545 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.015527 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 20:06:12.015652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.015543 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 20:06:12.015652 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.015561 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ntvkb\""
Apr 17 20:06:12.016836 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.016820 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 20:06:12.016909 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.016841 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 20:06:12.016966 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.016956 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 20:06:12.112507 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nstbb\" (UniqueName: \"kubernetes.io/projected/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-kube-api-access-nstbb\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112633 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112511 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-root\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112633 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112558 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-sys\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112633 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-wtmp\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112742 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112643 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-textfile\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112742 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112674 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-metrics-client-ca\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112742 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-accelerators-collector-config\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112860 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112752 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-tls\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.112860 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.112798 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213486 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213459 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nstbb\" (UniqueName: \"kubernetes.io/projected/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-kube-api-access-nstbb\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213632 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-root\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213632 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-sys\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213632 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213540 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-wtmp\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213632 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213573 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-textfile\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213632 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-root\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213632 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213609 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-sys\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213946 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-metrics-client-ca\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213946 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213693 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-accelerators-collector-config\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213946 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-tls\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213946 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213751 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213946 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-wtmp\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.213946 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.213907 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-textfile\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.214205 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.214184 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-metrics-client-ca\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.214308 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.214284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-accelerators-collector-config\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.215967 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.215946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.216084 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.216067 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-node-exporter-tls\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.221159 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.221141 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nstbb\" (UniqueName: \"kubernetes.io/projected/8c22a5a8-04ab-4e78-8ac1-d3248878d68e-kube-api-access-nstbb\") pod \"node-exporter-k27c8\" (UID: \"8c22a5a8-04ab-4e78-8ac1-d3248878d68e\") " pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.321661 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:12.321604 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k27c8"
Apr 17 20:06:12.330359 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:06:12.330334 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c22a5a8_04ab_4e78_8ac1_d3248878d68e.slice/crio-9c470b6ac10cc8b55a1ef6ba8e6e071a1fb4efc47dca25bffdceb3155d126131 WatchSource:0}: Error finding container 9c470b6ac10cc8b55a1ef6ba8e6e071a1fb4efc47dca25bffdceb3155d126131: Status 404 returned error can't find the container with id 9c470b6ac10cc8b55a1ef6ba8e6e071a1fb4efc47dca25bffdceb3155d126131
Apr 17 20:06:13.269633 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:13.269604 2567 generic.go:358] "Generic (PLEG): container finished" podID="8c22a5a8-04ab-4e78-8ac1-d3248878d68e" containerID="ecb75c1abf9438cb74a85db66a20bad55c89578fd0a89bb8173a0ce9366184fd" exitCode=0
Apr 17 20:06:13.269940 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:13.269651 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k27c8" event={"ID":"8c22a5a8-04ab-4e78-8ac1-d3248878d68e","Type":"ContainerDied","Data":"ecb75c1abf9438cb74a85db66a20bad55c89578fd0a89bb8173a0ce9366184fd"}
Apr 17 20:06:13.269940 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:13.269679 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k27c8" event={"ID":"8c22a5a8-04ab-4e78-8ac1-d3248878d68e","Type":"ContainerStarted","Data":"9c470b6ac10cc8b55a1ef6ba8e6e071a1fb4efc47dca25bffdceb3155d126131"}
Apr 17 20:06:14.070425 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:06:14.070377 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" podUID="2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"
Apr 17 20:06:14.076489 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:06:14.076461 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zmjhn" podUID="84ab49f7-96ea-40fb-b996-8b5492b23d01"
Apr 17 20:06:14.086624 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:06:14.086596 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-755zw" podUID="4ff6857d-6533-480b-ba95-f01666563ed0"
Apr 17 20:06:14.274060 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:14.274034 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:06:14.274379 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:14.274074 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:06:14.274379 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:14.274034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k27c8" event={"ID":"8c22a5a8-04ab-4e78-8ac1-d3248878d68e","Type":"ContainerStarted","Data":"b90b6de93789df7e31def596016fb88d7923ea8a78772ffddd7bea0e744f22b6"}
Apr 17 20:06:14.274379 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:14.274135 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k27c8" event={"ID":"8c22a5a8-04ab-4e78-8ac1-d3248878d68e","Type":"ContainerStarted","Data":"b22cdab8216e33fbc6d4fbe818705f6204110e952a6e87d218f99f662f8c678d"}
Apr 17 20:06:14.294134 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:14.294090 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k27c8" podStartSLOduration=2.5508026360000002 podStartE2EDuration="3.294080792s" podCreationTimestamp="2026-04-17 20:06:11 +0000 UTC" firstStartedPulling="2026-04-17 20:06:12.332527269 +0000 UTC m=+155.143942132" lastFinishedPulling="2026-04-17 20:06:13.075805421 +0000 UTC m=+155.887220288" observedRunningTime="2026-04-17 20:06:14.293912875 +0000 UTC m=+157.105327761" watchObservedRunningTime="2026-04-17 20:06:14.294080792 +0000 UTC m=+157.105495677"
Apr 17 20:06:14.820666 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:06:14.820626 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6vrjk" podUID="3ab88728-120f-4d07-91b8-97fe1307e061"
Apr 17 20:06:19.068600 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.068554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:06:19.069119 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.068624 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:06:19.069119 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.068651 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:06:19.071861 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.071816 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ff6857d-6533-480b-ba95-f01666563ed0-metrics-tls\") pod \"dns-default-755zw\" (UID: \"4ff6857d-6533-480b-ba95-f01666563ed0\") " pod="openshift-dns/dns-default-755zw"
Apr 17 20:06:19.072019 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.071980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"image-registry-57566cfc79-8bxjf\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") " pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:06:19.074376 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.074347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84ab49f7-96ea-40fb-b996-8b5492b23d01-cert\") pod \"ingress-canary-zmjhn\" (UID: \"84ab49f7-96ea-40fb-b996-8b5492b23d01\") " pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:06:19.078276 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.078249 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tl4j2\""
Apr 17 20:06:19.078405 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.078249 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rkflr\""
Apr 17 20:06:19.085481 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.085453 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zmjhn"
Apr 17 20:06:19.085580 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.085549 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:06:19.214908 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.214885 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57566cfc79-8bxjf"]
Apr 17 20:06:19.217351 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:06:19.217322 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4c1a4a_9198_4e12_8f13_0db4fd4b5ff4.slice/crio-94fcf9908fbe0a6a6e3ae0190b71285230375c2e4632b2a3f86722afc66cdc0c WatchSource:0}: Error finding container 94fcf9908fbe0a6a6e3ae0190b71285230375c2e4632b2a3f86722afc66cdc0c: Status 404 returned error can't find the container with id 94fcf9908fbe0a6a6e3ae0190b71285230375c2e4632b2a3f86722afc66cdc0c
Apr 17 20:06:19.223642 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.223618 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zmjhn"]
Apr 17 20:06:19.226320 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:06:19.226295 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ab49f7_96ea_40fb_b996_8b5492b23d01.slice/crio-c56263e4f5f2b3ee71d4d2314d6df43c21425278536d14a81ef68c5e3a3b48cb WatchSource:0}: Error finding container c56263e4f5f2b3ee71d4d2314d6df43c21425278536d14a81ef68c5e3a3b48cb: Status 404 returned error can't find the container with id c56263e4f5f2b3ee71d4d2314d6df43c21425278536d14a81ef68c5e3a3b48cb
Apr 17 20:06:19.286801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.286753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zmjhn" event={"ID":"84ab49f7-96ea-40fb-b996-8b5492b23d01","Type":"ContainerStarted","Data":"c56263e4f5f2b3ee71d4d2314d6df43c21425278536d14a81ef68c5e3a3b48cb"}
Apr 17 20:06:19.288013 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.287989 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" event={"ID":"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4","Type":"ContainerStarted","Data":"d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf"}
Apr 17 20:06:19.288013 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.288013 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" event={"ID":"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4","Type":"ContainerStarted","Data":"94fcf9908fbe0a6a6e3ae0190b71285230375c2e4632b2a3f86722afc66cdc0c"}
Apr 17 20:06:19.288187 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.288128 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:06:19.305687 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:19.305643 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" podStartSLOduration=161.305630569 podStartE2EDuration="2m41.305630569s" podCreationTimestamp="2026-04-17 20:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:06:19.304810897 +0000 UTC m=+162.116225785" watchObservedRunningTime="2026-04-17 20:06:19.305630569 +0000 UTC m=+162.117045453"
Apr 17 20:06:20.983500 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:20.983440 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" podUID="206e65d1-c12e-41a7-88b3-b06e61138b95" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.6:8000/readyz\": dial tcp 10.132.0.6:8000: connect: connection refused"
Apr 17 20:06:21.295095 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.295057 2567 generic.go:358] "Generic (PLEG): container finished" podID="206e65d1-c12e-41a7-88b3-b06e61138b95" containerID="b055d92c884ff3bb655648180c681d9ad08e819db44774a96b303de40aabca35" exitCode=1
Apr 17 20:06:21.295279 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.295130 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" event={"ID":"206e65d1-c12e-41a7-88b3-b06e61138b95","Type":"ContainerDied","Data":"b055d92c884ff3bb655648180c681d9ad08e819db44774a96b303de40aabca35"}
Apr 17 20:06:21.295494 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.295473 2567 scope.go:117] "RemoveContainer" containerID="b055d92c884ff3bb655648180c681d9ad08e819db44774a96b303de40aabca35"
Apr 17 20:06:21.296576 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.296555 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zmjhn" event={"ID":"84ab49f7-96ea-40fb-b996-8b5492b23d01","Type":"ContainerStarted","Data":"521bcd0183b2bab725a9356449cd437bd4f680a693016bd2894312192ce5987c"}
Apr 17 20:06:21.297822 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.297800 2567 generic.go:358] "Generic (PLEG): container finished" podID="e856013c-9b3d-4e4a-9498-4fcefde1b527" containerID="c900d7a0f8dde4057a9f125c098b52c9d3bc73e7cb0ee8ed6d8c1eb8cae60d5e" exitCode=255
Apr 17 20:06:21.297877 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.297846 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" event={"ID":"e856013c-9b3d-4e4a-9498-4fcefde1b527","Type":"ContainerDied","Data":"c900d7a0f8dde4057a9f125c098b52c9d3bc73e7cb0ee8ed6d8c1eb8cae60d5e"}
Apr 17 20:06:21.298158 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.298141 2567 scope.go:117] "RemoveContainer" containerID="c900d7a0f8dde4057a9f125c098b52c9d3bc73e7cb0ee8ed6d8c1eb8cae60d5e"
Apr 17 20:06:21.313362 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.313339 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"
Apr 17 20:06:21.324527 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.324483 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zmjhn" podStartSLOduration=129.671393124 podStartE2EDuration="2m11.324469089s" podCreationTimestamp="2026-04-17 20:04:10 +0000 UTC" firstStartedPulling="2026-04-17 20:06:19.228800708 +0000 UTC m=+162.040215571" lastFinishedPulling="2026-04-17 20:06:20.881876672 +0000 UTC m=+163.693291536" observedRunningTime="2026-04-17 20:06:21.323990171 +0000 UTC m=+164.135405050" watchObservedRunningTime="2026-04-17 20:06:21.324469089 +0000 UTC m=+164.135883973"
Apr 17 20:06:21.357738 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:21.357711 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk"
Apr 17 20:06:22.305407 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:22.305371 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-c855cb67f-66tfk" event={"ID":"e856013c-9b3d-4e4a-9498-4fcefde1b527","Type":"ContainerStarted","Data":"26e7483468b316d085113ef6169573fa8d45773742115d6546b5f0eb813785ca"}
Apr 17 20:06:22.306860 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:22.306832 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv" event={"ID":"206e65d1-c12e-41a7-88b3-b06e61138b95","Type":"ContainerStarted","Data":"7293b929ad9f8ce652fe74c490a4b329f82630a20f0377e9ce787b1c1aa36024"}
Apr 17 20:06:22.307117 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:22.307101 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"
Apr 17 20:06:22.308171 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:22.308153 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68ddbbc475-ntvzv"
Apr 17 20:06:25.720543 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:25.720463 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57566cfc79-8bxjf"]
Apr 17 20:06:26.797548 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:26.797515 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk"
Apr 17 20:06:28.797406 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:28.797368 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-755zw"
Apr 17 20:06:28.799987 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:28.799969 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4cbzc\""
Apr 17 20:06:28.808062 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:28.808037 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-755zw"
Apr 17 20:06:28.926619 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:28.926589 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-755zw"]
Apr 17 20:06:28.929345 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:06:28.929313 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff6857d_6533_480b_ba95_f01666563ed0.slice/crio-3a4e5c32a7ef71f15750dad3c955c4ae2ee9756c532f8c91ca5921c6ca0c0261 WatchSource:0}: Error finding container 3a4e5c32a7ef71f15750dad3c955c4ae2ee9756c532f8c91ca5921c6ca0c0261: Status 404 returned error can't find the container with id 3a4e5c32a7ef71f15750dad3c955c4ae2ee9756c532f8c91ca5921c6ca0c0261
Apr 17 20:06:29.326114 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:29.326079 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-755zw" event={"ID":"4ff6857d-6533-480b-ba95-f01666563ed0","Type":"ContainerStarted","Data":"3a4e5c32a7ef71f15750dad3c955c4ae2ee9756c532f8c91ca5921c6ca0c0261"}
Apr 17 20:06:31.332887 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:31.332852 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-755zw" event={"ID":"4ff6857d-6533-480b-ba95-f01666563ed0","Type":"ContainerStarted","Data":"c05eaf422415a0d7422feab64037f4113c18a63e93ec7d3dd742e94a87380b9e"}
Apr 17 20:06:31.332887 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:31.332888 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-755zw" event={"ID":"4ff6857d-6533-480b-ba95-f01666563ed0","Type":"ContainerStarted","Data":"4ddb7afae5e609eb41d3922efdc66e4fd519f628eef94a47b4d096d41724f24a"}
Apr 17 20:06:31.333271 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:31.332999 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-755zw"
Apr 17 20:06:31.350064 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:31.350016 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-755zw" podStartSLOduration=139.898081758 podStartE2EDuration="2m21.350003032s" podCreationTimestamp="2026-04-17 20:04:10 +0000 UTC" firstStartedPulling="2026-04-17 20:06:28.931131838 +0000 UTC m=+171.742546705" lastFinishedPulling="2026-04-17 20:06:30.383053112 +0000 UTC m=+173.194467979" observedRunningTime="2026-04-17 20:06:31.348293016 +0000 UTC m=+174.159707900" watchObservedRunningTime="2026-04-17 20:06:31.350003032 +0000 UTC m=+174.161417980"
Apr 17 20:06:35.725932 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:35.725903 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:06:41.337809 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:41.337781 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-755zw"
Apr 17 20:06:50.738798 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:50.738720 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" podUID="2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" containerName="registry" containerID="cri-o://d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf" gracePeriod=30
Apr 17 20:06:50.978473 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:50.978450 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf"
Apr 17 20:06:51.107306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107219 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-certificates\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107256 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sxql\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-kube-api-access-7sxql\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107275 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107307 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-bound-sa-token\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107570 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107422 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-installation-pull-secrets\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107570 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107472 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-trusted-ca\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107570 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107519 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-image-registry-private-configuration\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107570 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107552 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-ca-trust-extracted\") pod \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\" (UID: \"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4\") "
Apr 17 20:06:51.107789 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107711 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:06:51.107978 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.107940 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-certificates\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 17 20:06:51.108152 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.108064 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:06:51.110050 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.109970 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:06:51.110050 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.110002 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:06:51.110216 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.110112 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:06:51.110216 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.110149 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-kube-api-access-7sxql" (OuterVolumeSpecName: "kube-api-access-7sxql") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "kube-api-access-7sxql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:06:51.110216 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.110176 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:06:51.116620 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.116597 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" (UID: "2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:06:51.208951 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.208908 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-trusted-ca\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:06:51.208951 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.208947 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-image-registry-private-configuration\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:06:51.208951 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.208960 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-ca-trust-extracted\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:06:51.208951 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.208969 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sxql\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-kube-api-access-7sxql\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:06:51.209189 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.208979 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-registry-tls\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:06:51.209189 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.208989 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-bound-sa-token\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 
20:06:51.209189 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.208997 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4-installation-pull-secrets\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:06:51.383948 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.383860 2567 generic.go:358] "Generic (PLEG): container finished" podID="2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" containerID="d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf" exitCode=0 Apr 17 20:06:51.383948 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.383923 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" Apr 17 20:06:51.383948 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.383938 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" event={"ID":"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4","Type":"ContainerDied","Data":"d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf"} Apr 17 20:06:51.384166 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.383967 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57566cfc79-8bxjf" event={"ID":"2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4","Type":"ContainerDied","Data":"94fcf9908fbe0a6a6e3ae0190b71285230375c2e4632b2a3f86722afc66cdc0c"} Apr 17 20:06:51.384166 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.383983 2567 scope.go:117] "RemoveContainer" containerID="d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf" Apr 17 20:06:51.392206 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.392162 2567 scope.go:117] "RemoveContainer" containerID="d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf" Apr 17 20:06:51.392473 ip-10-0-134-158 kubenswrapper[2567]: E0417 
20:06:51.392452 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf\": container with ID starting with d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf not found: ID does not exist" containerID="d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf" Apr 17 20:06:51.392538 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.392483 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf"} err="failed to get container status \"d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf\": rpc error: code = NotFound desc = could not find container \"d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf\": container with ID starting with d565f971c4e9bbd609f099d391a9396dd353edad10019d8fe171160921253daf not found: ID does not exist" Apr 17 20:06:51.404875 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.404849 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57566cfc79-8bxjf"] Apr 17 20:06:51.407737 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.407716 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57566cfc79-8bxjf"] Apr 17 20:06:51.801853 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:51.801810 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" path="/var/lib/kubelet/pods/2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4/volumes" Apr 17 20:06:52.662662 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:06:52.662633 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zmjhn_84ab49f7-96ea-40fb-b996-8b5492b23d01/serve-healthcheck-canary/0.log" Apr 17 20:07:21.345846 
ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:21.345806 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" podUID="42d25788-0fa0-415b-814b-07fd37617909" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 20:07:31.345626 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:31.345583 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" podUID="42d25788-0fa0-415b-814b-07fd37617909" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 20:07:41.346087 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:41.346043 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" podUID="42d25788-0fa0-415b-814b-07fd37617909" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 20:07:41.346469 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:41.346118 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" Apr 17 20:07:41.346584 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:41.346566 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"4888e72356e028c1b39a279499977922dfb0a2773c7812dda9b26b1a1bbb17af"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 20:07:41.346624 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:41.346604 2567 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" podUID="42d25788-0fa0-415b-814b-07fd37617909" containerName="service-proxy" containerID="cri-o://4888e72356e028c1b39a279499977922dfb0a2773c7812dda9b26b1a1bbb17af" gracePeriod=30 Apr 17 20:07:41.516622 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:41.516592 2567 generic.go:358] "Generic (PLEG): container finished" podID="42d25788-0fa0-415b-814b-07fd37617909" containerID="4888e72356e028c1b39a279499977922dfb0a2773c7812dda9b26b1a1bbb17af" exitCode=2 Apr 17 20:07:41.516744 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:41.516659 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" event={"ID":"42d25788-0fa0-415b-814b-07fd37617909","Type":"ContainerDied","Data":"4888e72356e028c1b39a279499977922dfb0a2773c7812dda9b26b1a1bbb17af"} Apr 17 20:07:41.516744 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:41.516700 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f87669895-79jr4" event={"ID":"42d25788-0fa0-415b-814b-07fd37617909","Type":"ContainerStarted","Data":"a8a234cd0fb43c84188ffdeef66332f5a5b41951840bcaa1d3d54116c869134b"} Apr 17 20:07:49.638946 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:49.638909 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: \"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:07:49.641126 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:49.641104 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab88728-120f-4d07-91b8-97fe1307e061-metrics-certs\") pod \"network-metrics-daemon-6vrjk\" (UID: 
\"3ab88728-120f-4d07-91b8-97fe1307e061\") " pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:07:49.900431 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:49.900363 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x8jh2\"" Apr 17 20:07:49.908173 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:49.908154 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6vrjk" Apr 17 20:07:50.019805 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:50.019758 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6vrjk"] Apr 17 20:07:50.023342 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:07:50.023300 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ab88728_120f_4d07_91b8_97fe1307e061.slice/crio-95cf66465d786f7e3e0d871a2fd56292d16939990341e06c4c481ecb84472fdc WatchSource:0}: Error finding container 95cf66465d786f7e3e0d871a2fd56292d16939990341e06c4c481ecb84472fdc: Status 404 returned error can't find the container with id 95cf66465d786f7e3e0d871a2fd56292d16939990341e06c4c481ecb84472fdc Apr 17 20:07:50.541357 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:50.541312 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vrjk" event={"ID":"3ab88728-120f-4d07-91b8-97fe1307e061","Type":"ContainerStarted","Data":"95cf66465d786f7e3e0d871a2fd56292d16939990341e06c4c481ecb84472fdc"} Apr 17 20:07:51.547144 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:51.547113 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vrjk" event={"ID":"3ab88728-120f-4d07-91b8-97fe1307e061","Type":"ContainerStarted","Data":"e502e8a291fb706180c1d275e8c0d3969d4d0b43e091c6c43acb0d81fcebf6da"} Apr 17 20:07:51.547144 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:07:51.547147 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6vrjk" event={"ID":"3ab88728-120f-4d07-91b8-97fe1307e061","Type":"ContainerStarted","Data":"84256d21deecda0b6371cbdd3b2ee87a9cb3f9c7ea88a7d29e75c222973735b8"} Apr 17 20:07:51.562427 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:07:51.562383 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6vrjk" podStartSLOduration=253.534422829 podStartE2EDuration="4m14.562368413s" podCreationTimestamp="2026-04-17 20:03:37 +0000 UTC" firstStartedPulling="2026-04-17 20:07:50.02508065 +0000 UTC m=+252.836495514" lastFinishedPulling="2026-04-17 20:07:51.053026232 +0000 UTC m=+253.864441098" observedRunningTime="2026-04-17 20:07:51.561166338 +0000 UTC m=+254.372581223" watchObservedRunningTime="2026-04-17 20:07:51.562368413 +0000 UTC m=+254.373783373" Apr 17 20:08:37.728357 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:08:37.728328 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:10:00.036958 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.036924 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p"] Apr 17 20:10:00.037439 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.037138 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" containerName="registry" Apr 17 20:10:00.037439 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.037148 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" containerName="registry" Apr 17 20:10:00.037439 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.037195 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c4c1a4a-9198-4e12-8f13-0db4fd4b5ff4" containerName="registry" Apr 17 20:10:00.039821 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:10:00.039805 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.042186 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.042164 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 20:10:00.042309 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.042187 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-jzgfq\"" Apr 17 20:10:00.042977 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.042961 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:10:00.047898 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.047876 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p"] Apr 17 20:10:00.149189 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.149148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q69b7\" (UniqueName: \"kubernetes.io/projected/3975a812-dbe2-47f2-92e3-d5146537c825-kube-api-access-q69b7\") pod \"openshift-lws-operator-bfc7f696d-ldl6p\" (UID: \"3975a812-dbe2-47f2-92e3-d5146537c825\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.149414 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.149205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3975a812-dbe2-47f2-92e3-d5146537c825-tmp\") pod \"openshift-lws-operator-bfc7f696d-ldl6p\" (UID: \"3975a812-dbe2-47f2-92e3-d5146537c825\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.250370 
ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.250331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q69b7\" (UniqueName: \"kubernetes.io/projected/3975a812-dbe2-47f2-92e3-d5146537c825-kube-api-access-q69b7\") pod \"openshift-lws-operator-bfc7f696d-ldl6p\" (UID: \"3975a812-dbe2-47f2-92e3-d5146537c825\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.250370 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.250375 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3975a812-dbe2-47f2-92e3-d5146537c825-tmp\") pod \"openshift-lws-operator-bfc7f696d-ldl6p\" (UID: \"3975a812-dbe2-47f2-92e3-d5146537c825\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.250693 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.250678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3975a812-dbe2-47f2-92e3-d5146537c825-tmp\") pod \"openshift-lws-operator-bfc7f696d-ldl6p\" (UID: \"3975a812-dbe2-47f2-92e3-d5146537c825\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.258425 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.258396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q69b7\" (UniqueName: \"kubernetes.io/projected/3975a812-dbe2-47f2-92e3-d5146537c825-kube-api-access-q69b7\") pod \"openshift-lws-operator-bfc7f696d-ldl6p\" (UID: \"3975a812-dbe2-47f2-92e3-d5146537c825\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.348942 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.348851 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" Apr 17 20:10:00.461115 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.461080 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p"] Apr 17 20:10:00.463876 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:10:00.463848 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3975a812_dbe2_47f2_92e3_d5146537c825.slice/crio-b73086d73b6c35e0db97ba6d06cfda847bdf18936de2d10b854b8f20c27a8330 WatchSource:0}: Error finding container b73086d73b6c35e0db97ba6d06cfda847bdf18936de2d10b854b8f20c27a8330: Status 404 returned error can't find the container with id b73086d73b6c35e0db97ba6d06cfda847bdf18936de2d10b854b8f20c27a8330 Apr 17 20:10:00.465257 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.465241 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:10:00.862827 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:00.862791 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" event={"ID":"3975a812-dbe2-47f2-92e3-d5146537c825","Type":"ContainerStarted","Data":"b73086d73b6c35e0db97ba6d06cfda847bdf18936de2d10b854b8f20c27a8330"} Apr 17 20:10:03.873075 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:03.873040 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" event={"ID":"3975a812-dbe2-47f2-92e3-d5146537c825","Type":"ContainerStarted","Data":"4646a87cedc82b488af7516fbe1c5d723797a001aea703329897c30d40ac1091"} Apr 17 20:10:03.888639 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:03.888584 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ldl6p" 
podStartSLOduration=1.536870184 podStartE2EDuration="3.888568133s" podCreationTimestamp="2026-04-17 20:10:00 +0000 UTC" firstStartedPulling="2026-04-17 20:10:00.465367649 +0000 UTC m=+383.276782511" lastFinishedPulling="2026-04-17 20:10:02.817065592 +0000 UTC m=+385.628480460" observedRunningTime="2026-04-17 20:10:03.888284469 +0000 UTC m=+386.699699353" watchObservedRunningTime="2026-04-17 20:10:03.888568133 +0000 UTC m=+386.699983019" Apr 17 20:10:22.899030 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.898977 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"] Apr 17 20:10:22.902495 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.902475 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5" Apr 17 20:10:22.905303 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.905279 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 20:10:22.905709 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.905687 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-csxrb\"" Apr 17 20:10:22.905817 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.905801 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 20:10:22.905978 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.905959 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 20:10:22.906048 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.905992 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 
20:10:22.916817 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:22.916746 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"] Apr 17 20:10:23.007249 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.007214 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/322436b5-ab0a-49e8-a03a-275c8cf8bb07-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5" Apr 17 20:10:23.007249 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.007255 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-269fs\" (UniqueName: \"kubernetes.io/projected/322436b5-ab0a-49e8-a03a-275c8cf8bb07-kube-api-access-269fs\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5" Apr 17 20:10:23.007483 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.007295 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/322436b5-ab0a-49e8-a03a-275c8cf8bb07-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5" Apr 17 20:10:23.108563 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.108518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/322436b5-ab0a-49e8-a03a-275c8cf8bb07-webhook-cert\") pod 
\"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:23.108757 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.108569 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-269fs\" (UniqueName: \"kubernetes.io/projected/322436b5-ab0a-49e8-a03a-275c8cf8bb07-kube-api-access-269fs\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:23.108757 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.108621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/322436b5-ab0a-49e8-a03a-275c8cf8bb07-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:23.110995 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.110965 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/322436b5-ab0a-49e8-a03a-275c8cf8bb07-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:23.111096 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.111024 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/322436b5-ab0a-49e8-a03a-275c8cf8bb07-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:23.119709 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.119685 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-269fs\" (UniqueName: \"kubernetes.io/projected/322436b5-ab0a-49e8-a03a-275c8cf8bb07-kube-api-access-269fs\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5\" (UID: \"322436b5-ab0a-49e8-a03a-275c8cf8bb07\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:23.213222 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.213129 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:23.332504 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.332465 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"]
Apr 17 20:10:23.335999 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:10:23.335970 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod322436b5_ab0a_49e8_a03a_275c8cf8bb07.slice/crio-a8f71d4562a30a7d98d545e3516f8be92c79cfb1ff1a7ff1858376716a868ead WatchSource:0}: Error finding container a8f71d4562a30a7d98d545e3516f8be92c79cfb1ff1a7ff1858376716a868ead: Status 404 returned error can't find the container with id a8f71d4562a30a7d98d545e3516f8be92c79cfb1ff1a7ff1858376716a868ead
Apr 17 20:10:23.922503 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:23.922462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5" event={"ID":"322436b5-ab0a-49e8-a03a-275c8cf8bb07","Type":"ContainerStarted","Data":"a8f71d4562a30a7d98d545e3516f8be92c79cfb1ff1a7ff1858376716a868ead"}
Apr 17 20:10:25.929594 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:25.929546 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5" event={"ID":"322436b5-ab0a-49e8-a03a-275c8cf8bb07","Type":"ContainerStarted","Data":"bada28d28267d97ef0cc15110c134b3427be95ec8c0a3903365af423d67f8244"}
Apr 17 20:10:25.930076 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:25.930020 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:25.951386 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:25.951332 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5" podStartSLOduration=1.564112854 podStartE2EDuration="3.951317223s" podCreationTimestamp="2026-04-17 20:10:22 +0000 UTC" firstStartedPulling="2026-04-17 20:10:23.33964346 +0000 UTC m=+406.151058327" lastFinishedPulling="2026-04-17 20:10:25.726847829 +0000 UTC m=+408.538262696" observedRunningTime="2026-04-17 20:10:25.950109113 +0000 UTC m=+408.761523997" watchObservedRunningTime="2026-04-17 20:10:25.951317223 +0000 UTC m=+408.762732111"
Apr 17 20:10:36.934639 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:36.934603 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5"
Apr 17 20:10:40.503263 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.503227 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"]
Apr 17 20:10:40.510547 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.510524 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.512932 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.512904 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 20:10:40.513057 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.513001 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 20:10:40.513057 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.513049 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 20:10:40.513167 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.513117 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 20:10:40.513335 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.513319 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lx7ns\""
Apr 17 20:10:40.515449 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.515428 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"]
Apr 17 20:10:40.536620 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.536592 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l97qm\" (UniqueName: \"kubernetes.io/projected/d1acc34f-386a-4d98-a4ae-63572849c747-kube-api-access-l97qm\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.536794 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.536626 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1acc34f-386a-4d98-a4ae-63572849c747-tmp\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.536794 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.536672 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1acc34f-386a-4d98-a4ae-63572849c747-tls-certs\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.637898 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.637860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l97qm\" (UniqueName: \"kubernetes.io/projected/d1acc34f-386a-4d98-a4ae-63572849c747-kube-api-access-l97qm\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.637898 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.637901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1acc34f-386a-4d98-a4ae-63572849c747-tmp\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.638135 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.637928 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1acc34f-386a-4d98-a4ae-63572849c747-tls-certs\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.640117 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.640083 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1acc34f-386a-4d98-a4ae-63572849c747-tmp\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.640356 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.640337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1acc34f-386a-4d98-a4ae-63572849c747-tls-certs\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.647208 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.647187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l97qm\" (UniqueName: \"kubernetes.io/projected/d1acc34f-386a-4d98-a4ae-63572849c747-kube-api-access-l97qm\") pod \"kube-auth-proxy-6568cc58bc-nccqs\" (UID: \"d1acc34f-386a-4d98-a4ae-63572849c747\") " pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.820751 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.820660 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"
Apr 17 20:10:40.938295 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.938266 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs"]
Apr 17 20:10:40.940984 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:10:40.940955 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1acc34f_386a_4d98_a4ae_63572849c747.slice/crio-1c9e20d848a25286c9c97870e6834b84a6b3cd26d1b0f2d569d1987b3547b929 WatchSource:0}: Error finding container 1c9e20d848a25286c9c97870e6834b84a6b3cd26d1b0f2d569d1987b3547b929: Status 404 returned error can't find the container with id 1c9e20d848a25286c9c97870e6834b84a6b3cd26d1b0f2d569d1987b3547b929
Apr 17 20:10:40.969726 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:40.969692 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs" event={"ID":"d1acc34f-386a-4d98-a4ae-63572849c747","Type":"ContainerStarted","Data":"1c9e20d848a25286c9c97870e6834b84a6b3cd26d1b0f2d569d1987b3547b929"}
Apr 17 20:10:43.768021 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.766194 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jv6nv"]
Apr 17 20:10:43.770293 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.770261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:43.772554 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.772530 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 17 20:10:43.772554 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.772550 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-8ftv8\""
Apr 17 20:10:43.776259 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.776234 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jv6nv"]
Apr 17 20:10:43.859545 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.859512 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdqm\" (UniqueName: \"kubernetes.io/projected/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-kube-api-access-krdqm\") pod \"odh-model-controller-858dbf95b8-jv6nv\" (UID: \"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138\") " pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:43.859721 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.859570 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-cert\") pod \"odh-model-controller-858dbf95b8-jv6nv\" (UID: \"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138\") " pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:43.960445 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.960408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-cert\") pod \"odh-model-controller-858dbf95b8-jv6nv\" (UID: \"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138\") " pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:43.960629 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.960475 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krdqm\" (UniqueName: \"kubernetes.io/projected/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-kube-api-access-krdqm\") pod \"odh-model-controller-858dbf95b8-jv6nv\" (UID: \"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138\") " pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:43.960629 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:10:43.960570 2567 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 20:10:43.960834 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:10:43.960674 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-cert podName:127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138 nodeName:}" failed. No retries permitted until 2026-04-17 20:10:44.460647804 +0000 UTC m=+427.272062668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-cert") pod "odh-model-controller-858dbf95b8-jv6nv" (UID: "127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138") : secret "odh-model-controller-webhook-cert" not found
Apr 17 20:10:43.969447 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:43.969407 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdqm\" (UniqueName: \"kubernetes.io/projected/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-kube-api-access-krdqm\") pod \"odh-model-controller-858dbf95b8-jv6nv\" (UID: \"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138\") " pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:44.464619 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:44.464574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-cert\") pod \"odh-model-controller-858dbf95b8-jv6nv\" (UID: \"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138\") " pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:44.467779 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:44.467738 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138-cert\") pod \"odh-model-controller-858dbf95b8-jv6nv\" (UID: \"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138\") " pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:44.681237 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:44.681191 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:44.805939 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:44.805904 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-jv6nv"]
Apr 17 20:10:44.809284 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:10:44.809253 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127cbcb7_41bb_4f5e_8e0c_cb8a3dffe138.slice/crio-e607798a187a18052774e88ea2ccb2ba6b7a3ce5836a8eb9cc82a55b27ccf0e2 WatchSource:0}: Error finding container e607798a187a18052774e88ea2ccb2ba6b7a3ce5836a8eb9cc82a55b27ccf0e2: Status 404 returned error can't find the container with id e607798a187a18052774e88ea2ccb2ba6b7a3ce5836a8eb9cc82a55b27ccf0e2
Apr 17 20:10:44.983234 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:44.983140 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs" event={"ID":"d1acc34f-386a-4d98-a4ae-63572849c747","Type":"ContainerStarted","Data":"392bc77dc59c04654e885c072a76ff81b7adeee933e20d87e859c8b03cf32832"}
Apr 17 20:10:44.984248 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:44.984219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" event={"ID":"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138","Type":"ContainerStarted","Data":"e607798a187a18052774e88ea2ccb2ba6b7a3ce5836a8eb9cc82a55b27ccf0e2"}
Apr 17 20:10:44.998260 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:44.998200 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6568cc58bc-nccqs" podStartSLOduration=1.358254361 podStartE2EDuration="4.998181167s" podCreationTimestamp="2026-04-17 20:10:40 +0000 UTC" firstStartedPulling="2026-04-17 20:10:40.943035635 +0000 UTC m=+423.754450511" lastFinishedPulling="2026-04-17 20:10:44.58296245 +0000 UTC m=+427.394377317" observedRunningTime="2026-04-17 20:10:44.998040944 +0000 UTC m=+427.809455829" watchObservedRunningTime="2026-04-17 20:10:44.998181167 +0000 UTC m=+427.809596053"
Apr 17 20:10:47.994987 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:47.994954 2567 generic.go:358] "Generic (PLEG): container finished" podID="127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138" containerID="633f5ce96935e69ba50b585693de49f72736151eb57777542b95f43fde60284e" exitCode=1
Apr 17 20:10:47.995402 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:47.995028 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" event={"ID":"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138","Type":"ContainerDied","Data":"633f5ce96935e69ba50b585693de49f72736151eb57777542b95f43fde60284e"}
Apr 17 20:10:47.995402 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:47.995215 2567 scope.go:117] "RemoveContainer" containerID="633f5ce96935e69ba50b585693de49f72736151eb57777542b95f43fde60284e"
Apr 17 20:10:48.999660 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:48.999618 2567 generic.go:358] "Generic (PLEG): container finished" podID="127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138" containerID="2c1cb0dc8d37f768a382c4f0e4fe01646afa2e56232af50bd0b57dd41d4fb6ad" exitCode=1
Apr 17 20:10:49.000101 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:48.999709 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" event={"ID":"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138","Type":"ContainerDied","Data":"2c1cb0dc8d37f768a382c4f0e4fe01646afa2e56232af50bd0b57dd41d4fb6ad"}
Apr 17 20:10:49.000101 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:48.999758 2567 scope.go:117] "RemoveContainer" containerID="633f5ce96935e69ba50b585693de49f72736151eb57777542b95f43fde60284e"
Apr 17 20:10:49.000101 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:48.999934 2567 scope.go:117] "RemoveContainer" containerID="2c1cb0dc8d37f768a382c4f0e4fe01646afa2e56232af50bd0b57dd41d4fb6ad"
Apr 17 20:10:49.000219 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:10:49.000150 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jv6nv_opendatahub(127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" podUID="127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138"
Apr 17 20:10:50.004147 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.004118 2567 scope.go:117] "RemoveContainer" containerID="2c1cb0dc8d37f768a382c4f0e4fe01646afa2e56232af50bd0b57dd41d4fb6ad"
Apr 17 20:10:50.004524 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:10:50.004291 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jv6nv_opendatahub(127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" podUID="127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138"
Apr 17 20:10:50.570598 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.570562 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"]
Apr 17 20:10:50.574881 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.574863 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:50.577276 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.577252 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 17 20:10:50.577462 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.577446 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-hbdgm\""
Apr 17 20:10:50.577523 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.577499 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 17 20:10:50.586273 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.586252 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"]
Apr 17 20:10:50.609118 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.609087 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8580e5cd-32cd-4129-ab62-fc6158434747-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2lxmw\" (UID: \"8580e5cd-32cd-4129-ab62-fc6158434747\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:50.609266 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.609133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m6fk\" (UniqueName: \"kubernetes.io/projected/8580e5cd-32cd-4129-ab62-fc6158434747-kube-api-access-4m6fk\") pod \"servicemesh-operator3-55f49c5f94-2lxmw\" (UID: \"8580e5cd-32cd-4129-ab62-fc6158434747\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:50.709938 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.709901 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8580e5cd-32cd-4129-ab62-fc6158434747-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2lxmw\" (UID: \"8580e5cd-32cd-4129-ab62-fc6158434747\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:50.710151 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.709947 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m6fk\" (UniqueName: \"kubernetes.io/projected/8580e5cd-32cd-4129-ab62-fc6158434747-kube-api-access-4m6fk\") pod \"servicemesh-operator3-55f49c5f94-2lxmw\" (UID: \"8580e5cd-32cd-4129-ab62-fc6158434747\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:50.712497 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.712468 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/8580e5cd-32cd-4129-ab62-fc6158434747-operator-config\") pod \"servicemesh-operator3-55f49c5f94-2lxmw\" (UID: \"8580e5cd-32cd-4129-ab62-fc6158434747\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:50.720869 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.720838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m6fk\" (UniqueName: \"kubernetes.io/projected/8580e5cd-32cd-4129-ab62-fc6158434747-kube-api-access-4m6fk\") pod \"servicemesh-operator3-55f49c5f94-2lxmw\" (UID: \"8580e5cd-32cd-4129-ab62-fc6158434747\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:50.883983 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:50.883886 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:51.036296 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:51.036258 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"]
Apr 17 20:10:51.042805 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:10:51.042755 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8580e5cd_32cd_4129_ab62_fc6158434747.slice/crio-0ba1ca40131cc3328cb66da30c40117ed586eff38030894208836bcc98b3bf8c WatchSource:0}: Error finding container 0ba1ca40131cc3328cb66da30c40117ed586eff38030894208836bcc98b3bf8c: Status 404 returned error can't find the container with id 0ba1ca40131cc3328cb66da30c40117ed586eff38030894208836bcc98b3bf8c
Apr 17 20:10:51.931898 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:51.931864 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-j28lg"]
Apr 17 20:10:51.936479 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:51.936454 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:51.939608 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:51.939583 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-bm84b\""
Apr 17 20:10:51.940343 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:51.940323 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 17 20:10:51.959177 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:51.959145 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-j28lg"]
Apr 17 20:10:52.011851 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.011816 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw" event={"ID":"8580e5cd-32cd-4129-ab62-fc6158434747","Type":"ContainerStarted","Data":"0ba1ca40131cc3328cb66da30c40117ed586eff38030894208836bcc98b3bf8c"}
Apr 17 20:10:52.020255 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.020222 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd5wl\" (UniqueName: \"kubernetes.io/projected/3b7a3096-84cc-4ab5-a245-b553e829d363-kube-api-access-xd5wl\") pod \"kserve-controller-manager-856948b99f-j28lg\" (UID: \"3b7a3096-84cc-4ab5-a245-b553e829d363\") " pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.020385 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.020275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7a3096-84cc-4ab5-a245-b553e829d363-cert\") pod \"kserve-controller-manager-856948b99f-j28lg\" (UID: \"3b7a3096-84cc-4ab5-a245-b553e829d363\") " pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.120755 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.120719 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7a3096-84cc-4ab5-a245-b553e829d363-cert\") pod \"kserve-controller-manager-856948b99f-j28lg\" (UID: \"3b7a3096-84cc-4ab5-a245-b553e829d363\") " pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.121251 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.120832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd5wl\" (UniqueName: \"kubernetes.io/projected/3b7a3096-84cc-4ab5-a245-b553e829d363-kube-api-access-xd5wl\") pod \"kserve-controller-manager-856948b99f-j28lg\" (UID: \"3b7a3096-84cc-4ab5-a245-b553e829d363\") " pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.121251 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:10:52.120930 2567 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 17 20:10:52.121251 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:10:52.121026 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b7a3096-84cc-4ab5-a245-b553e829d363-cert podName:3b7a3096-84cc-4ab5-a245-b553e829d363 nodeName:}" failed. No retries permitted until 2026-04-17 20:10:52.620999324 +0000 UTC m=+435.432414205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b7a3096-84cc-4ab5-a245-b553e829d363-cert") pod "kserve-controller-manager-856948b99f-j28lg" (UID: "3b7a3096-84cc-4ab5-a245-b553e829d363") : secret "kserve-webhook-server-cert" not found
Apr 17 20:10:52.138643 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.138606 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd5wl\" (UniqueName: \"kubernetes.io/projected/3b7a3096-84cc-4ab5-a245-b553e829d363-kube-api-access-xd5wl\") pod \"kserve-controller-manager-856948b99f-j28lg\" (UID: \"3b7a3096-84cc-4ab5-a245-b553e829d363\") " pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.626061 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.625985 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7a3096-84cc-4ab5-a245-b553e829d363-cert\") pod \"kserve-controller-manager-856948b99f-j28lg\" (UID: \"3b7a3096-84cc-4ab5-a245-b553e829d363\") " pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.628627 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.628595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b7a3096-84cc-4ab5-a245-b553e829d363-cert\") pod \"kserve-controller-manager-856948b99f-j28lg\" (UID: \"3b7a3096-84cc-4ab5-a245-b553e829d363\") " pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.847810 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.847744 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:52.973783 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:52.973705 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-j28lg"]
Apr 17 20:10:52.976911 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:10:52.976883 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7a3096_84cc_4ab5_a245_b553e829d363.slice/crio-6d9b3d171785a72ff626b56cf748cd3fe17ba5b118936b151c706b6710b827c6 WatchSource:0}: Error finding container 6d9b3d171785a72ff626b56cf748cd3fe17ba5b118936b151c706b6710b827c6: Status 404 returned error can't find the container with id 6d9b3d171785a72ff626b56cf748cd3fe17ba5b118936b151c706b6710b827c6
Apr 17 20:10:53.017036 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:53.016985 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-j28lg" event={"ID":"3b7a3096-84cc-4ab5-a245-b553e829d363","Type":"ContainerStarted","Data":"6d9b3d171785a72ff626b56cf748cd3fe17ba5b118936b151c706b6710b827c6"}
Apr 17 20:10:54.681739 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:54.681664 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv"
Apr 17 20:10:54.682114 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:54.682083 2567 scope.go:117] "RemoveContainer" containerID="2c1cb0dc8d37f768a382c4f0e4fe01646afa2e56232af50bd0b57dd41d4fb6ad"
Apr 17 20:10:54.682303 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:10:54.682284 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-jv6nv_opendatahub(127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138)\"" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" podUID="127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138"
Apr 17 20:10:57.033135 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.033095 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw" event={"ID":"8580e5cd-32cd-4129-ab62-fc6158434747","Type":"ContainerStarted","Data":"3c2a23a0eb61726d79c1f7534be70b9fbcce0b9ed4d1f104bb70228a7f1f2383"}
Apr 17 20:10:57.033637 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.033256 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw"
Apr 17 20:10:57.034649 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.034625 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-j28lg" event={"ID":"3b7a3096-84cc-4ab5-a245-b553e829d363","Type":"ContainerStarted","Data":"bc00593fc6d081c5c124f1deec24e2a419ac8a5580ab68eac3be80693f5f065e"}
Apr 17 20:10:57.034814 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.034796 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-j28lg"
Apr 17 20:10:57.064200 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.064148 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw" podStartSLOduration=1.978436602 podStartE2EDuration="7.064130934s" podCreationTimestamp="2026-04-17 20:10:50 +0000 UTC" firstStartedPulling="2026-04-17 20:10:51.046229536 +0000 UTC m=+433.857644399" lastFinishedPulling="2026-04-17 20:10:56.131923865 +0000 UTC m=+438.943338731" observedRunningTime="2026-04-17 20:10:57.062160034 +0000 UTC m=+439.873574918" watchObservedRunningTime="2026-04-17 20:10:57.064130934 +0000 UTC m=+439.875545818"
Apr 17 20:10:57.078401 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.078345 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-j28lg" podStartSLOduration=2.922441948 podStartE2EDuration="6.078328014s" podCreationTimestamp="2026-04-17 20:10:51 +0000 UTC" firstStartedPulling="2026-04-17 20:10:52.978303562 +0000 UTC m=+435.789718425" lastFinishedPulling="2026-04-17 20:10:56.134189625 +0000 UTC m=+438.945604491" observedRunningTime="2026-04-17 20:10:57.078130257 +0000 UTC m=+439.889545143" watchObservedRunningTime="2026-04-17 20:10:57.078328014 +0000 UTC m=+439.889742928"
Apr 17 20:10:57.136386 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.136345 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq"]
Apr 17 20:10:57.139951 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.139924 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq"
Apr 17 20:10:57.142445 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.142416 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 20:10:57.142584 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.142446 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 17 20:10:57.142641 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.142599 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 17 20:10:57.142714 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.142661 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-nh9xr\""
Apr 17 20:10:57.142938 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.142904 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 17 20:10:57.150133 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.150107 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq"]
Apr 17 20:10:57.166408 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.166376 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq"
Apr 17 20:10:57.166568 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.166414 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq"
Apr 17 20:10:57.166568 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.166485 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-724cl\" (UniqueName: \"kubernetes.io/projected/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-kube-api-access-724cl\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq"
Apr 17 20:10:57.166568 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.166549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-csr-ca-configmap\") pod
\"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.166673 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.166589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.166673 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.166628 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.166746 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.166685 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.267372 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.267334 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.267372 
ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.267372 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.267625 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.267393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.267625 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.267490 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-724cl\" (UniqueName: \"kubernetes.io/projected/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-kube-api-access-724cl\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.267625 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.267522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.267625 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.267564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.267625 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.267619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.268100 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.268070 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.269837 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.269815 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.269940 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.269817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" 
Apr 17 20:10:57.270181 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.270163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.270298 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.270273 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.274911 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.274891 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.275366 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.275344 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-724cl\" (UniqueName: \"kubernetes.io/projected/5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1-kube-api-access-724cl\") pod \"istiod-openshift-gateway-55ff986f96-szktq\" (UID: \"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.450700 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.450592 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:10:57.585042 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:57.585004 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq"] Apr 17 20:10:58.039596 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:10:58.039547 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" event={"ID":"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1","Type":"ContainerStarted","Data":"12cbbbb97db98bb7b641f9e2529419ec72bb898113afbf22997224e6da7175f1"} Apr 17 20:11:00.818475 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:00.818436 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 20:11:00.818739 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:00.818508 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 20:11:01.051106 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:01.051065 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" event={"ID":"5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1","Type":"ContainerStarted","Data":"0d977ccada5697462d7f87a3919e263501b0374fe2374e87b4aeb7f377946d24"} Apr 17 20:11:01.051355 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:01.051288 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:11:01.052748 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:01.052722 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-szktq container/discovery namespace/openshift-ingress: Readiness probe 
status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 20:11:01.052885 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:01.052786 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" podUID="5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:11:01.071847 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:01.071706 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" podStartSLOduration=0.849708564 podStartE2EDuration="4.071690766s" podCreationTimestamp="2026-04-17 20:10:57 +0000 UTC" firstStartedPulling="2026-04-17 20:10:57.596239146 +0000 UTC m=+440.407654009" lastFinishedPulling="2026-04-17 20:11:00.818221344 +0000 UTC m=+443.629636211" observedRunningTime="2026-04-17 20:11:01.070874827 +0000 UTC m=+443.882289710" watchObservedRunningTime="2026-04-17 20:11:01.071690766 +0000 UTC m=+443.883105652" Apr 17 20:11:02.056164 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:02.056120 2567 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-szktq container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 20:11:02.056686 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:02.056182 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" podUID="5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 20:11:04.682198 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:04.682148 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" Apr 17 20:11:04.682709 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:11:04.682551 2567 scope.go:117] "RemoveContainer" containerID="2c1cb0dc8d37f768a382c4f0e4fe01646afa2e56232af50bd0b57dd41d4fb6ad" Apr 17 20:11:05.056801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:05.056752 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-szktq" Apr 17 20:11:05.065759 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:05.065731 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" event={"ID":"127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138","Type":"ContainerStarted","Data":"583bf25732ac1d4b07beaa14d4d661ea5d6d51fc888ef41a7cebe8cfcfd061d7"} Apr 17 20:11:05.065968 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:05.065951 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" Apr 17 20:11:05.092029 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:05.091963 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" podStartSLOduration=1.9717454330000002 podStartE2EDuration="22.091944286s" podCreationTimestamp="2026-04-17 20:10:43 +0000 UTC" firstStartedPulling="2026-04-17 20:10:44.810546634 +0000 UTC m=+427.621961500" lastFinishedPulling="2026-04-17 20:11:04.930745487 +0000 UTC m=+447.742160353" observedRunningTime="2026-04-17 20:11:05.090153116 +0000 UTC m=+447.901568004" watchObservedRunningTime="2026-04-17 20:11:05.091944286 +0000 UTC m=+447.903359172" Apr 17 20:11:08.042603 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:08.042571 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-2lxmw" Apr 17 20:11:16.071726 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:16.071694 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/odh-model-controller-858dbf95b8-jv6nv" Apr 17 20:11:28.044670 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:11:28.044640 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-j28lg" Apr 17 20:12:13.761249 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.761208 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4"] Apr 17 20:12:13.764799 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.764753 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:13.768017 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.767980 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 20:12:13.768895 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.768876 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 20:12:13.769007 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.768896 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-cvr6x\"" Apr 17 20:12:13.771647 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.771619 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4"] Apr 17 20:12:13.833896 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.833847 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5p6g\" (UniqueName: \"kubernetes.io/projected/4ef53589-d963-4513-ad4d-32b56ddac0d6-kube-api-access-j5p6g\") pod \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" (UID: 
\"4ef53589-d963-4513-ad4d-32b56ddac0d6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:13.935039 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.934992 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5p6g\" (UniqueName: \"kubernetes.io/projected/4ef53589-d963-4513-ad4d-32b56ddac0d6-kube-api-access-j5p6g\") pod \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" (UID: \"4ef53589-d963-4513-ad4d-32b56ddac0d6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:13.944282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:13.944255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5p6g\" (UniqueName: \"kubernetes.io/projected/4ef53589-d963-4513-ad4d-32b56ddac0d6-kube-api-access-j5p6g\") pod \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" (UID: \"4ef53589-d963-4513-ad4d-32b56ddac0d6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:14.076882 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:14.076793 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:14.208753 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:14.208730 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4"] Apr 17 20:12:14.211810 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:12:14.211758 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef53589_d963_4513_ad4d_32b56ddac0d6.slice/crio-feccd521e3176bf7ab1385f01a93d9c0728dd2f8b498f56fc3ee71878fdff3c4 WatchSource:0}: Error finding container feccd521e3176bf7ab1385f01a93d9c0728dd2f8b498f56fc3ee71878fdff3c4: Status 404 returned error can't find the container with id feccd521e3176bf7ab1385f01a93d9c0728dd2f8b498f56fc3ee71878fdff3c4 Apr 17 20:12:14.277640 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:14.277589 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" event={"ID":"4ef53589-d963-4513-ad4d-32b56ddac0d6","Type":"ContainerStarted","Data":"feccd521e3176bf7ab1385f01a93d9c0728dd2f8b498f56fc3ee71878fdff3c4"} Apr 17 20:12:17.289718 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:17.289676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" event={"ID":"4ef53589-d963-4513-ad4d-32b56ddac0d6","Type":"ContainerStarted","Data":"65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5"} Apr 17 20:12:17.290154 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:17.289812 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:17.306320 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:17.306255 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" podStartSLOduration=2.278033457 podStartE2EDuration="4.306234794s" podCreationTimestamp="2026-04-17 20:12:13 +0000 UTC" firstStartedPulling="2026-04-17 20:12:14.213891291 +0000 UTC m=+517.025306153" lastFinishedPulling="2026-04-17 20:12:16.242092624 +0000 UTC m=+519.053507490" observedRunningTime="2026-04-17 20:12:17.304838737 +0000 UTC m=+520.116253624" watchObservedRunningTime="2026-04-17 20:12:17.306234794 +0000 UTC m=+520.117649680" Apr 17 20:12:18.381981 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.381944 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz"] Apr 17 20:12:18.385640 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.385616 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.387872 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.387852 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rbv24\"" Apr 17 20:12:18.399915 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.399885 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz"] Apr 17 20:12:18.470638 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.470600 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45tt2\" (UniqueName: \"kubernetes.io/projected/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-kube-api-access-45tt2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.470638 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.470653 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.571616 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.571568 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45tt2\" (UniqueName: \"kubernetes.io/projected/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-kube-api-access-45tt2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.571834 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.571626 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.572018 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.572000 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.579884 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.579862 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-45tt2\" (UniqueName: \"kubernetes.io/projected/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-kube-api-access-45tt2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.695493 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.695395 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:18.821051 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:18.821007 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz"] Apr 17 20:12:18.824048 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:12:18.824019 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06c8e26b_e9f1_4c8d_a8a3_432c1d815c84.slice/crio-4cf10493bf154f8f8f4f633511752e5849bcbca4eb686cc75551310bdb00051d WatchSource:0}: Error finding container 4cf10493bf154f8f8f4f633511752e5849bcbca4eb686cc75551310bdb00051d: Status 404 returned error can't find the container with id 4cf10493bf154f8f8f4f633511752e5849bcbca4eb686cc75551310bdb00051d Apr 17 20:12:19.297164 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:19.297123 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" event={"ID":"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84","Type":"ContainerStarted","Data":"4cf10493bf154f8f8f4f633511752e5849bcbca4eb686cc75551310bdb00051d"} Apr 17 20:12:25.326458 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:25.326409 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" 
event={"ID":"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84","Type":"ContainerStarted","Data":"07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2"} Apr 17 20:12:25.326937 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:25.326568 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:25.352146 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:25.352086 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" podStartSLOduration=1.924679332 podStartE2EDuration="7.352069825s" podCreationTimestamp="2026-04-17 20:12:18 +0000 UTC" firstStartedPulling="2026-04-17 20:12:18.826456729 +0000 UTC m=+521.637871593" lastFinishedPulling="2026-04-17 20:12:24.253847224 +0000 UTC m=+527.065262086" observedRunningTime="2026-04-17 20:12:25.350555819 +0000 UTC m=+528.161970699" watchObservedRunningTime="2026-04-17 20:12:25.352069825 +0000 UTC m=+528.163484708" Apr 17 20:12:28.295365 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:28.295333 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:36.331971 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:36.331938 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:37.986378 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:37.986334 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz"] Apr 17 20:12:37.986801 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:37.986599 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" 
podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" containerName="manager" containerID="cri-o://07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2" gracePeriod=2 Apr 17 20:12:37.988863 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:37.988819 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:37.989689 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:37.989637 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz"] Apr 17 20:12:38.012284 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.012257 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"] Apr 17 20:12:38.012565 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.012552 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" containerName="manager" Apr 17 20:12:38.012608 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.012567 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" containerName="manager" Apr 17 20:12:38.012641 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.012612 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" containerName="manager" Apr 17 20:12:38.015469 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.015449 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.019168 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.019148 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4"] Apr 17 20:12:38.019462 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.019417 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" containerName="manager" containerID="cri-o://65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5" gracePeriod=2 Apr 17 20:12:38.028476 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.028451 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4"] Apr 17 20:12:38.030127 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.030105 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"] Apr 17 20:12:38.051570 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.051531 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.053303 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.053276 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4"] Apr 17 20:12:38.053596 ip-10-0-134-158 kubenswrapper[2567]: I0417 
20:12:38.053580 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" containerName="manager" Apr 17 20:12:38.053649 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.053606 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" containerName="manager" Apr 17 20:12:38.053689 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.053574 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.053689 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.053681 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" containerName="manager" Apr 17 20:12:38.055474 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.055445 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.056510 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.056492 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" Apr 17 20:12:38.058273 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.058237 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.060003 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.059974 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.067635 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.067614 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4"] Apr 17 20:12:38.130166 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.130136 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-67zvz\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.130284 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:12:38.130229 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mgm2\" (UniqueName: \"kubernetes.io/projected/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-kube-api-access-6mgm2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-67zvz\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.130284 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.130272 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4np8t\" (UniqueName: \"kubernetes.io/projected/46331141-894b-46d1-85b0-7e29c19ffc46-kube-api-access-4np8t\") pod \"limitador-operator-controller-manager-85c4996f8c-zs9n4\" (UID: \"46331141-894b-46d1-85b0-7e29c19ffc46\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" Apr 17 20:12:38.231458 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.231426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mgm2\" (UniqueName: \"kubernetes.io/projected/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-kube-api-access-6mgm2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-67zvz\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.231635 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.231484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4np8t\" (UniqueName: \"kubernetes.io/projected/46331141-894b-46d1-85b0-7e29c19ffc46-kube-api-access-4np8t\") pod \"limitador-operator-controller-manager-85c4996f8c-zs9n4\" (UID: \"46331141-894b-46d1-85b0-7e29c19ffc46\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" Apr 17 20:12:38.231635 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.231528 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-67zvz\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.232053 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.232030 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-67zvz\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.242839 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.242812 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4np8t\" (UniqueName: \"kubernetes.io/projected/46331141-894b-46d1-85b0-7e29c19ffc46-kube-api-access-4np8t\") pod \"limitador-operator-controller-manager-85c4996f8c-zs9n4\" (UID: \"46331141-894b-46d1-85b0-7e29c19ffc46\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" Apr 17 20:12:38.242964 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.242949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mgm2\" (UniqueName: \"kubernetes.io/projected/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-kube-api-access-6mgm2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-67zvz\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.255296 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.255278 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:38.257252 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.257226 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.258354 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.258339 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:38.259307 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.259281 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.261180 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.261159 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 
'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.263463 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.263444 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.332476 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.332452 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-extensions-socket-volume\") pod \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " Apr 17 20:12:38.332573 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.332492 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5p6g\" (UniqueName: \"kubernetes.io/projected/4ef53589-d963-4513-ad4d-32b56ddac0d6-kube-api-access-j5p6g\") pod \"4ef53589-d963-4513-ad4d-32b56ddac0d6\" (UID: \"4ef53589-d963-4513-ad4d-32b56ddac0d6\") " Apr 17 20:12:38.332573 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.332538 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45tt2\" (UniqueName: \"kubernetes.io/projected/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-kube-api-access-45tt2\") pod \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\" (UID: \"06c8e26b-e9f1-4c8d-a8a3-432c1d815c84\") " Apr 17 20:12:38.332921 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.332898 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" (UID: "06c8e26b-e9f1-4c8d-a8a3-432c1d815c84"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:12:38.334604 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.334583 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-kube-api-access-45tt2" (OuterVolumeSpecName: "kube-api-access-45tt2") pod "06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" (UID: "06c8e26b-e9f1-4c8d-a8a3-432c1d815c84"). InnerVolumeSpecName "kube-api-access-45tt2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:12:38.334659 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.334617 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef53589-d963-4513-ad4d-32b56ddac0d6-kube-api-access-j5p6g" (OuterVolumeSpecName: "kube-api-access-j5p6g") pod "4ef53589-d963-4513-ad4d-32b56ddac0d6" (UID: "4ef53589-d963-4513-ad4d-32b56ddac0d6"). InnerVolumeSpecName "kube-api-access-j5p6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:12:38.373208 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.373183 2567 generic.go:358] "Generic (PLEG): container finished" podID="4ef53589-d963-4513-ad4d-32b56ddac0d6" containerID="65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5" exitCode=0 Apr 17 20:12:38.373306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.373234 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" Apr 17 20:12:38.373306 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.373279 2567 scope.go:117] "RemoveContainer" containerID="65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5" Apr 17 20:12:38.374500 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.374470 2567 generic.go:358] "Generic (PLEG): container finished" podID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" containerID="07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2" exitCode=0 Apr 17 20:12:38.374591 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.374513 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" Apr 17 20:12:38.375887 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.375858 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.378077 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.378054 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.380018 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:12:38.379995 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.382070 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.381927 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.382070 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.381977 2567 scope.go:117] "RemoveContainer" containerID="65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5" Apr 17 20:12:38.382295 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:12:38.382271 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5\": container with ID starting with 65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5 not found: ID does not exist" containerID="65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5" Apr 17 20:12:38.382365 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.382306 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5"} err="failed to get container status \"65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5\": rpc error: code = NotFound desc = could not find container \"65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5\": container with ID starting with 65ad147d7a868b6338277596d7228a6103cfa562cf65a45f4924b1236d9cfaa5 not found: ID does not exist" Apr 17 20:12:38.382365 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.382330 2567 scope.go:117] "RemoveContainer" containerID="07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2" Apr 17 20:12:38.386449 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.386427 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.388233 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.388213 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.389607 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.389595 2567 scope.go:117] "RemoveContainer" containerID="07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2" Apr 17 
20:12:38.390071 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:12:38.390045 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2\": container with ID starting with 07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2 not found: ID does not exist" containerID="07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2" Apr 17 20:12:38.390116 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.390082 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2"} err="failed to get container status \"07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2\": rpc error: code = NotFound desc = could not find container \"07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2\": container with ID starting with 07df6ec38c247a209e5f8f71457edeed931c94006b55150da8d2b707893f41d2 not found: ID does not exist" Apr 17 20:12:38.390153 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.390116 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.391898 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.391877 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: 
User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:38.429196 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.429174 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" Apr 17 20:12:38.433090 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.433067 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-extensions-socket-volume\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:12:38.433152 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.433100 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j5p6g\" (UniqueName: \"kubernetes.io/projected/4ef53589-d963-4513-ad4d-32b56ddac0d6-kube-api-access-j5p6g\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:12:38.433152 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.433119 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-45tt2\" (UniqueName: \"kubernetes.io/projected/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84-kube-api-access-45tt2\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\"" Apr 17 20:12:38.435910 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.435891 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" Apr 17 20:12:38.563547 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.563290 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"] Apr 17 20:12:38.566948 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:12:38.566914 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb584e2e_5a22_4f46_bbae_dcfd31b6c86a.slice/crio-c0bf21a24c4f904986f60f5c6d9549636d6093ef6beae7bf33abdec259d1c35e WatchSource:0}: Error finding container c0bf21a24c4f904986f60f5c6d9549636d6093ef6beae7bf33abdec259d1c35e: Status 404 returned error can't find the container with id c0bf21a24c4f904986f60f5c6d9549636d6093ef6beae7bf33abdec259d1c35e Apr 17 20:12:38.599088 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:38.599033 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4"] Apr 17 20:12:38.601238 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:12:38.601211 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46331141_894b_46d1_85b0_7e29c19ffc46.slice/crio-9690dc4efcb01d9bf1d16ee1ab0898e6913b91ede0886975f853384a7cb01a37 WatchSource:0}: Error finding container 9690dc4efcb01d9bf1d16ee1ab0898e6913b91ede0886975f853384a7cb01a37: Status 404 returned error can't find the container with id 9690dc4efcb01d9bf1d16ee1ab0898e6913b91ede0886975f853384a7cb01a37 Apr 17 20:12:39.380425 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.380338 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" 
event={"ID":"46331141-894b-46d1-85b0-7e29c19ffc46","Type":"ContainerStarted","Data":"9f768b1af586e7add69e29f42942c4a56618655c5968df7dad1173cc8e8ba5b1"} Apr 17 20:12:39.380425 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.380379 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" event={"ID":"46331141-894b-46d1-85b0-7e29c19ffc46","Type":"ContainerStarted","Data":"9690dc4efcb01d9bf1d16ee1ab0898e6913b91ede0886975f853384a7cb01a37"} Apr 17 20:12:39.380926 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.380474 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" Apr 17 20:12:39.382461 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.382437 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object" Apr 17 20:12:39.382584 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.382550 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" event={"ID":"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a","Type":"ContainerStarted","Data":"49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8"} Apr 17 20:12:39.382584 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.382575 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" 
event={"ID":"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a","Type":"ContainerStarted","Data":"c0bf21a24c4f904986f60f5c6d9549636d6093ef6beae7bf33abdec259d1c35e"}
Apr 17 20:12:39.382684 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.382674 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"
Apr 17 20:12:39.398443 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.398399 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4" podStartSLOduration=1.398387309 podStartE2EDuration="1.398387309s" podCreationTimestamp="2026-04-17 20:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:12:39.39707619 +0000 UTC m=+542.208491084" watchObservedRunningTime="2026-04-17 20:12:39.398387309 +0000 UTC m=+542.209802194"
Apr 17 20:12:39.398981 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.398952 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object"
Apr 17 20:12:39.400732 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.400712 2567 status_manager.go:895] "Failed to get status for pod" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8dnk4" err="pods \"limitador-operator-controller-manager-85c4996f8c-8dnk4\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object"
Apr 17 20:12:39.417377 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.417330 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" podStartSLOduration=2.4173200440000002 podStartE2EDuration="2.417320044s" podCreationTimestamp="2026-04-17 20:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:12:39.416446982 +0000 UTC m=+542.227861864" watchObservedRunningTime="2026-04-17 20:12:39.417320044 +0000 UTC m=+542.228734928"
Apr 17 20:12:39.417997 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.417971 2567 status_manager.go:895] "Failed to get status for pod" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-rdbqz" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-rdbqz\" is forbidden: User \"system:node:ip-10-0-134-158.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-158.ec2.internal' and this object"
Apr 17 20:12:39.801396 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.801361 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c8e26b-e9f1-4c8d-a8a3-432c1d815c84" path="/var/lib/kubelet/pods/06c8e26b-e9f1-4c8d-a8a3-432c1d815c84/volumes"
Apr 17 20:12:39.801666 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:39.801654 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef53589-d963-4513-ad4d-32b56ddac0d6" path="/var/lib/kubelet/pods/4ef53589-d963-4513-ad4d-32b56ddac0d6/volumes"
Apr 17 20:12:50.388391 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:50.388347 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-zs9n4"
Apr 17 20:12:50.388913 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:12:50.388419 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"
Apr 17 20:13:08.395099 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.395058 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"]
Apr 17 20:13:08.395493 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.395323 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" podUID="eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" containerName="manager" containerID="cri-o://49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8" gracePeriod=10
Apr 17 20:13:08.624536 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.623165 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"]
Apr 17 20:13:08.629786 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.626988 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:08.629786 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.629090 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"]
Apr 17 20:13:08.656432 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.656410 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"
Apr 17 20:13:08.768606 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.768569 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mgm2\" (UniqueName: \"kubernetes.io/projected/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-kube-api-access-6mgm2\") pod \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") "
Apr 17 20:13:08.768817 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.768643 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-extensions-socket-volume\") pod \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\" (UID: \"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a\") "
Apr 17 20:13:08.768905 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.768820 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8267d39c-de55-4fcc-9f78-ae472ddb5ff4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kb42s\" (UID: \"8267d39c-de55-4fcc-9f78-ae472ddb5ff4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:08.768968 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.768925 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hct\" (UniqueName: \"kubernetes.io/projected/8267d39c-de55-4fcc-9f78-ae472ddb5ff4-kube-api-access-c5hct\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kb42s\" (UID: \"8267d39c-de55-4fcc-9f78-ae472ddb5ff4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:08.769103 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.769079 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" (UID: "eb584e2e-5a22-4f46-bbae-dcfd31b6c86a"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:13:08.770524 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.770499 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-kube-api-access-6mgm2" (OuterVolumeSpecName: "kube-api-access-6mgm2") pod "eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" (UID: "eb584e2e-5a22-4f46-bbae-dcfd31b6c86a"). InnerVolumeSpecName "kube-api-access-6mgm2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:13:08.869659 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.869619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hct\" (UniqueName: \"kubernetes.io/projected/8267d39c-de55-4fcc-9f78-ae472ddb5ff4-kube-api-access-c5hct\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kb42s\" (UID: \"8267d39c-de55-4fcc-9f78-ae472ddb5ff4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:08.869864 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.869687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8267d39c-de55-4fcc-9f78-ae472ddb5ff4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kb42s\" (UID: \"8267d39c-de55-4fcc-9f78-ae472ddb5ff4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:08.869864 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.869780 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6mgm2\" (UniqueName: \"kubernetes.io/projected/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-kube-api-access-6mgm2\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 17 20:13:08.869864 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.869796 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a-extensions-socket-volume\") on node \"ip-10-0-134-158.ec2.internal\" DevicePath \"\""
Apr 17 20:13:08.870082 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.870062 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8267d39c-de55-4fcc-9f78-ae472ddb5ff4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kb42s\" (UID: \"8267d39c-de55-4fcc-9f78-ae472ddb5ff4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:08.877724 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.877693 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hct\" (UniqueName: \"kubernetes.io/projected/8267d39c-de55-4fcc-9f78-ae472ddb5ff4-kube-api-access-c5hct\") pod \"kuadrant-operator-controller-manager-55c7f4c975-kb42s\" (UID: \"8267d39c-de55-4fcc-9f78-ae472ddb5ff4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:08.955253 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:08.955157 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:09.082972 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.082948 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"]
Apr 17 20:13:09.088706 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:13:09.088679 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8267d39c_de55_4fcc_9f78_ae472ddb5ff4.slice/crio-2a10c6bdd3ef79cf524708f75c98e18162ce4d6b9ac3cac67c1ad39e96dad7a0 WatchSource:0}: Error finding container 2a10c6bdd3ef79cf524708f75c98e18162ce4d6b9ac3cac67c1ad39e96dad7a0: Status 404 returned error can't find the container with id 2a10c6bdd3ef79cf524708f75c98e18162ce4d6b9ac3cac67c1ad39e96dad7a0
Apr 17 20:13:09.477724 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.477691 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s" event={"ID":"8267d39c-de55-4fcc-9f78-ae472ddb5ff4","Type":"ContainerStarted","Data":"b2904c019d011d1c3531570450049a2f9d33d34e912e31f539b9529e667828fc"}
Apr 17 20:13:09.477724 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.477725 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s" event={"ID":"8267d39c-de55-4fcc-9f78-ae472ddb5ff4","Type":"ContainerStarted","Data":"2a10c6bdd3ef79cf524708f75c98e18162ce4d6b9ac3cac67c1ad39e96dad7a0"}
Apr 17 20:13:09.478219 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.477758 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:09.478942 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.478919 2567 generic.go:358] "Generic (PLEG): container finished" podID="eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" containerID="49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8" exitCode=0
Apr 17 20:13:09.479023 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.478970 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" event={"ID":"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a","Type":"ContainerDied","Data":"49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8"}
Apr 17 20:13:09.479023 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.478974 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"
Apr 17 20:13:09.479023 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.478991 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz" event={"ID":"eb584e2e-5a22-4f46-bbae-dcfd31b6c86a","Type":"ContainerDied","Data":"c0bf21a24c4f904986f60f5c6d9549636d6093ef6beae7bf33abdec259d1c35e"}
Apr 17 20:13:09.479023 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.479005 2567 scope.go:117] "RemoveContainer" containerID="49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8"
Apr 17 20:13:09.486673 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.486565 2567 scope.go:117] "RemoveContainer" containerID="49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8"
Apr 17 20:13:09.486902 ip-10-0-134-158 kubenswrapper[2567]: E0417 20:13:09.486884 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8\": container with ID starting with 49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8 not found: ID does not exist" containerID="49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8"
Apr 17 20:13:09.486970 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.486910 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8"} err="failed to get container status \"49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8\": rpc error: code = NotFound desc = could not find container \"49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8\": container with ID starting with 49e06c986b9e48aebafb515d9371720123f2f58974612ae8759190cbb9b6edd8 not found: ID does not exist"
Apr 17 20:13:09.495612 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.495568 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s" podStartSLOduration=1.4955573690000001 podStartE2EDuration="1.495557369s" podCreationTimestamp="2026-04-17 20:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:13:09.494868857 +0000 UTC m=+572.306283744" watchObservedRunningTime="2026-04-17 20:13:09.495557369 +0000 UTC m=+572.306972253"
Apr 17 20:13:09.511531 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.511509 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"]
Apr 17 20:13:09.514884 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.514864 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-67zvz"]
Apr 17 20:13:09.802476 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:09.802399 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" path="/var/lib/kubelet/pods/eb584e2e-5a22-4f46-bbae-dcfd31b6c86a/volumes"
Apr 17 20:13:20.485629 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:20.485594 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kb42s"
Apr 17 20:13:44.475788 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.475736 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-7qw5s"]
Apr 17 20:13:44.476177 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.476074 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" containerName="manager"
Apr 17 20:13:44.476177 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.476086 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" containerName="manager"
Apr 17 20:13:44.476177 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.476139 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb584e2e-5a22-4f46-bbae-dcfd31b6c86a" containerName="manager"
Apr 17 20:13:44.480343 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.480321 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.483327 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.483305 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-8nqbl\""
Apr 17 20:13:44.483451 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.483363 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 17 20:13:44.487037 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.487003 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-7qw5s"]
Apr 17 20:13:44.538101 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.538069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmnx9\" (UniqueName: \"kubernetes.io/projected/89cf816c-9dc7-43ec-aa96-5d4f109e720e-kube-api-access-rmnx9\") pod \"postgres-868db5846d-7qw5s\" (UID: \"89cf816c-9dc7-43ec-aa96-5d4f109e720e\") " pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.538285 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.538126 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89cf816c-9dc7-43ec-aa96-5d4f109e720e-data\") pod \"postgres-868db5846d-7qw5s\" (UID: \"89cf816c-9dc7-43ec-aa96-5d4f109e720e\") " pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.638560 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.638523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89cf816c-9dc7-43ec-aa96-5d4f109e720e-data\") pod \"postgres-868db5846d-7qw5s\" (UID: \"89cf816c-9dc7-43ec-aa96-5d4f109e720e\") " pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.638741 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.638597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmnx9\" (UniqueName: \"kubernetes.io/projected/89cf816c-9dc7-43ec-aa96-5d4f109e720e-kube-api-access-rmnx9\") pod \"postgres-868db5846d-7qw5s\" (UID: \"89cf816c-9dc7-43ec-aa96-5d4f109e720e\") " pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.638991 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.638968 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89cf816c-9dc7-43ec-aa96-5d4f109e720e-data\") pod \"postgres-868db5846d-7qw5s\" (UID: \"89cf816c-9dc7-43ec-aa96-5d4f109e720e\") " pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.647207 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.647175 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmnx9\" (UniqueName: \"kubernetes.io/projected/89cf816c-9dc7-43ec-aa96-5d4f109e720e-kube-api-access-rmnx9\") pod \"postgres-868db5846d-7qw5s\" (UID: \"89cf816c-9dc7-43ec-aa96-5d4f109e720e\") " pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.793129 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.793095 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:44.918059 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:44.918032 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-7qw5s"]
Apr 17 20:13:44.920850 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:13:44.920816 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89cf816c_9dc7_43ec_aa96_5d4f109e720e.slice/crio-edc0c079710fde941597017fce6336395f27772219bb118687959f19d6f3d0cd WatchSource:0}: Error finding container edc0c079710fde941597017fce6336395f27772219bb118687959f19d6f3d0cd: Status 404 returned error can't find the container with id edc0c079710fde941597017fce6336395f27772219bb118687959f19d6f3d0cd
Apr 17 20:13:45.599274 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:45.599242 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-7qw5s" event={"ID":"89cf816c-9dc7-43ec-aa96-5d4f109e720e","Type":"ContainerStarted","Data":"edc0c079710fde941597017fce6336395f27772219bb118687959f19d6f3d0cd"}
Apr 17 20:13:50.618319 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:50.618280 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-7qw5s" event={"ID":"89cf816c-9dc7-43ec-aa96-5d4f109e720e","Type":"ContainerStarted","Data":"1576f459eed1eaa369fac6e581e1490ad13fa9bc71c9a2bc0abef88f90e66b86"}
Apr 17 20:13:50.618714 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:50.618401 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:13:50.635017 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:50.634957 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-7qw5s" podStartSLOduration=1.112057389 podStartE2EDuration="6.634937358s" podCreationTimestamp="2026-04-17 20:13:44 +0000 UTC" firstStartedPulling="2026-04-17 20:13:44.922451247 +0000 UTC m=+607.733866113" lastFinishedPulling="2026-04-17 20:13:50.445331205 +0000 UTC m=+613.256746082" observedRunningTime="2026-04-17 20:13:50.632319635 +0000 UTC m=+613.443734520" watchObservedRunningTime="2026-04-17 20:13:50.634937358 +0000 UTC m=+613.446352243"
Apr 17 20:13:56.649192 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:13:56.649123 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-7qw5s"
Apr 17 20:14:44.382589 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.382557 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5cd8994dd5-726v8"]
Apr 17 20:14:44.385994 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.385977 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.388494 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.388474 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 17 20:14:44.388606 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.388477 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qrvkh\""
Apr 17 20:14:44.389340 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.389319 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 17 20:14:44.394113 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.394094 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5cd8994dd5-726v8"]
Apr 17 20:14:44.501435 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.501406 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chx9j\" (UniqueName: \"kubernetes.io/projected/3834b8cf-c838-46ed-bcb2-223ebb63bc97-kube-api-access-chx9j\") pod \"maas-api-5cd8994dd5-726v8\" (UID: \"3834b8cf-c838-46ed-bcb2-223ebb63bc97\") " pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.501591 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.501473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3834b8cf-c838-46ed-bcb2-223ebb63bc97-maas-api-tls\") pod \"maas-api-5cd8994dd5-726v8\" (UID: \"3834b8cf-c838-46ed-bcb2-223ebb63bc97\") " pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.602760 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.602730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3834b8cf-c838-46ed-bcb2-223ebb63bc97-maas-api-tls\") pod \"maas-api-5cd8994dd5-726v8\" (UID: \"3834b8cf-c838-46ed-bcb2-223ebb63bc97\") " pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.602909 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.602793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chx9j\" (UniqueName: \"kubernetes.io/projected/3834b8cf-c838-46ed-bcb2-223ebb63bc97-kube-api-access-chx9j\") pod \"maas-api-5cd8994dd5-726v8\" (UID: \"3834b8cf-c838-46ed-bcb2-223ebb63bc97\") " pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.605125 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.605096 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3834b8cf-c838-46ed-bcb2-223ebb63bc97-maas-api-tls\") pod \"maas-api-5cd8994dd5-726v8\" (UID: \"3834b8cf-c838-46ed-bcb2-223ebb63bc97\") " pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.610531 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.610505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chx9j\" (UniqueName: \"kubernetes.io/projected/3834b8cf-c838-46ed-bcb2-223ebb63bc97-kube-api-access-chx9j\") pod \"maas-api-5cd8994dd5-726v8\" (UID: \"3834b8cf-c838-46ed-bcb2-223ebb63bc97\") " pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.696486 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.696426 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:44.820525 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:44.820494 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5cd8994dd5-726v8"]
Apr 17 20:14:44.823621 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:14:44.823594 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3834b8cf_c838_46ed_bcb2_223ebb63bc97.slice/crio-848eab0e1e38a73d7b6857708d03a81b28df33293e75b45a44ecff5cd3ecc307 WatchSource:0}: Error finding container 848eab0e1e38a73d7b6857708d03a81b28df33293e75b45a44ecff5cd3ecc307: Status 404 returned error can't find the container with id 848eab0e1e38a73d7b6857708d03a81b28df33293e75b45a44ecff5cd3ecc307
Apr 17 20:14:45.794712 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:45.794675 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5cd8994dd5-726v8" event={"ID":"3834b8cf-c838-46ed-bcb2-223ebb63bc97","Type":"ContainerStarted","Data":"848eab0e1e38a73d7b6857708d03a81b28df33293e75b45a44ecff5cd3ecc307"}
Apr 17 20:14:47.802694 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:47.802655 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5cd8994dd5-726v8" event={"ID":"3834b8cf-c838-46ed-bcb2-223ebb63bc97","Type":"ContainerStarted","Data":"1a510c5bcbc10ab26e4d3e6a94361c3e8ad0f2d8c4de183eb08ab58927262d59"}
Apr 17 20:14:47.803140 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:47.802731 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:14:47.831180 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:47.831132 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5cd8994dd5-726v8" podStartSLOduration=1.474755918 podStartE2EDuration="3.831116685s" podCreationTimestamp="2026-04-17 20:14:44 +0000 UTC" firstStartedPulling="2026-04-17 20:14:44.825062887 +0000 UTC m=+667.636477752" lastFinishedPulling="2026-04-17 20:14:47.181423653 +0000 UTC m=+669.992838519" observedRunningTime="2026-04-17 20:14:47.83028086 +0000 UTC m=+670.641695746" watchObservedRunningTime="2026-04-17 20:14:47.831116685 +0000 UTC m=+670.642531569"
Apr 17 20:14:53.812029 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:14:53.812003 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5cd8994dd5-726v8"
Apr 17 20:19:35.500046 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:35.500009 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-j28lg_3b7a3096-84cc-4ab5-a245-b553e829d363/manager/0.log"
Apr 17 20:19:35.601320 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:35.601290 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5cd8994dd5-726v8_3834b8cf-c838-46ed-bcb2-223ebb63bc97/maas-api/0.log"
Apr 17 20:19:35.824664 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:35.824582 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jv6nv_127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138/manager/2.log"
Apr 17 20:19:36.166890 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:36.166807 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5_322436b5-ab0a-49e8-a03a-275c8cf8bb07/manager/0.log"
Apr 17 20:19:36.276460 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:36.276429 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-7qw5s_89cf816c-9dc7-43ec-aa96-5d4f109e720e/postgres/0.log"
Apr 17 20:19:38.009990 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:38.009960 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-kb42s_8267d39c-de55-4fcc-9f78-ae472ddb5ff4/manager/0.log"
Apr 17 20:19:38.223282 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:38.223245 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-zs9n4_46331141-894b-46d1-85b0-7e29c19ffc46/manager/0.log"
Apr 17 20:19:38.671195 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:38.671150 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-szktq_5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1/discovery/0.log"
Apr 17 20:19:38.776579 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:38.776547 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6568cc58bc-nccqs_d1acc34f-386a-4d98-a4ae-63572849c747/kube-auth-proxy/0.log"
Apr 17 20:19:46.486383 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:46.486349 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dq6ll_7b977d53-172d-4a66-8807-758a1e1abc45/global-pull-secret-syncer/0.log"
Apr 17 20:19:46.644824 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:46.644791 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6sb8k_243940a2-412a-4019-b966-f66af5d78985/konnectivity-agent/0.log"
Apr 17 20:19:46.740072 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:46.739986 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-158.ec2.internal_51b4526a209496dd9377d4f989eaa37c/haproxy/0.log"
Apr 17 20:19:51.537807 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:51.537777 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-kb42s_8267d39c-de55-4fcc-9f78-ae472ddb5ff4/manager/0.log"
Apr 17 20:19:51.594262 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:51.594235 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-zs9n4_46331141-894b-46d1-85b0-7e29c19ffc46/manager/0.log"
Apr 17 20:19:53.375496 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:53.375469 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k27c8_8c22a5a8-04ab-4e78-8ac1-d3248878d68e/node-exporter/0.log"
Apr 17 20:19:53.396318 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:53.396293 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k27c8_8c22a5a8-04ab-4e78-8ac1-d3248878d68e/kube-rbac-proxy/0.log"
Apr 17 20:19:53.418244 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:53.418217 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k27c8_8c22a5a8-04ab-4e78-8ac1-d3248878d68e/init-textfile/0.log"
Apr 17 20:19:55.273984 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.273895 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"]
Apr 17 20:19:55.277142 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.277122 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.279638 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.279612 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrnl5\"/\"openshift-service-ca.crt\""
Apr 17 20:19:55.279638 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.279636 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrnl5\"/\"kube-root-ca.crt\""
Apr 17 20:19:55.280494 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.280481 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrnl5\"/\"default-dockercfg-fxgz8\""
Apr 17 20:19:55.288926 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.288900 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"]
Apr 17 20:19:55.360354 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.360317 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-podres\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.360354 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.360356 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-lib-modules\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.360621 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.360378 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-sys\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.360621 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.360504 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfjwc\" (UniqueName: \"kubernetes.io/projected/4816e67d-10f5-443b-91b0-783d6becd659-kube-api-access-bfjwc\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.360621 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.360545 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-proc\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.461263 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461217 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-podres\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.461443 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-lib-modules\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.461443 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461299 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-sys\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.461443 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461341 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfjwc\" (UniqueName: \"kubernetes.io/projected/4816e67d-10f5-443b-91b0-783d6becd659-kube-api-access-bfjwc\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.461443 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461369 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-proc\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.461443 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461411 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-podres\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"
Apr 17 20:19:55.461443 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461416 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName:
\"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-lib-modules\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" Apr 17 20:19:55.461726 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461418 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-sys\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" Apr 17 20:19:55.461726 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.461482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4816e67d-10f5-443b-91b0-783d6becd659-proc\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" Apr 17 20:19:55.469787 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.469741 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfjwc\" (UniqueName: \"kubernetes.io/projected/4816e67d-10f5-443b-91b0-783d6becd659-kube-api-access-bfjwc\") pod \"perf-node-gather-daemonset-vh2r4\" (UID: \"4816e67d-10f5-443b-91b0-783d6becd659\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" Apr 17 20:19:55.587199 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.587102 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" Apr 17 20:19:55.718174 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.718142 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"] Apr 17 20:19:55.720608 ip-10-0-134-158 kubenswrapper[2567]: W0417 20:19:55.720577 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4816e67d_10f5_443b_91b0_783d6becd659.slice/crio-e4043cd1cb909dc35d3467967fb5834c3bae0219146e4400485a2355394ced42 WatchSource:0}: Error finding container e4043cd1cb909dc35d3467967fb5834c3bae0219146e4400485a2355394ced42: Status 404 returned error can't find the container with id e4043cd1cb909dc35d3467967fb5834c3bae0219146e4400485a2355394ced42 Apr 17 20:19:55.722118 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.722103 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:19:55.816467 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:55.816425 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" event={"ID":"4816e67d-10f5-443b-91b0-783d6becd659","Type":"ContainerStarted","Data":"e4043cd1cb909dc35d3467967fb5834c3bae0219146e4400485a2355394ced42"} Apr 17 20:19:56.821037 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:56.821002 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" event={"ID":"4816e67d-10f5-443b-91b0-783d6becd659","Type":"ContainerStarted","Data":"5d0e0c8fa33f18e06df320dec23e73201565049e96bfb0cf98d5314891decdee"} Apr 17 20:19:56.821454 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:56.821154 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" Apr 17 20:19:56.836547 ip-10-0-134-158 
kubenswrapper[2567]: I0417 20:19:56.836496 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" podStartSLOduration=1.836477415 podStartE2EDuration="1.836477415s" podCreationTimestamp="2026-04-17 20:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:19:56.83499129 +0000 UTC m=+979.646406177" watchObservedRunningTime="2026-04-17 20:19:56.836477415 +0000 UTC m=+979.647892279" Apr 17 20:19:57.505367 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:57.505338 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-755zw_4ff6857d-6533-480b-ba95-f01666563ed0/dns/0.log" Apr 17 20:19:57.526135 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:57.526110 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-755zw_4ff6857d-6533-480b-ba95-f01666563ed0/kube-rbac-proxy/0.log" Apr 17 20:19:57.688349 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:57.688323 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m59bq_153e1d90-2d3e-4083-88f9-781771f16266/dns-node-resolver/0.log" Apr 17 20:19:58.181683 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:58.181647 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cgt9r_0286d3b0-9e6f-498c-b50c-69d5149b3f0d/node-ca/0.log" Apr 17 20:19:59.040963 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:59.040928 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-szktq_5f29dc75-1dbd-46e0-98ea-24ee2a6bc2f1/discovery/0.log" Apr 17 20:19:59.060233 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:59.060210 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-6568cc58bc-nccqs_d1acc34f-386a-4d98-a4ae-63572849c747/kube-auth-proxy/0.log" Apr 17 20:19:59.673732 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:19:59.673702 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zmjhn_84ab49f7-96ea-40fb-b996-8b5492b23d01/serve-healthcheck-canary/0.log" Apr 17 20:20:00.256066 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:00.256026 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x5wz5_22c87446-981f-4f9a-8661-7b204afd155c/kube-rbac-proxy/0.log" Apr 17 20:20:00.277293 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:00.277266 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x5wz5_22c87446-981f-4f9a-8661-7b204afd155c/exporter/0.log" Apr 17 20:20:00.297399 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:00.297378 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x5wz5_22c87446-981f-4f9a-8661-7b204afd155c/extractor/0.log" Apr 17 20:20:02.105553 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:02.105521 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-j28lg_3b7a3096-84cc-4ab5-a245-b553e829d363/manager/0.log" Apr 17 20:20:02.126672 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:02.126641 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5cd8994dd5-726v8_3834b8cf-c838-46ed-bcb2-223ebb63bc97/maas-api/0.log" Apr 17 20:20:02.194399 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:02.194371 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jv6nv_127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138/manager/1.log" Apr 17 20:20:02.204891 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:02.204863 2567 log.go:25] "Finished 
parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-jv6nv_127cbcb7-41bb-4f5e-8e0c-cb8a3dffe138/manager/2.log" Apr 17 20:20:02.294475 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:02.294441 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bcb6fdd5f-xtfg5_322436b5-ab0a-49e8-a03a-275c8cf8bb07/manager/0.log" Apr 17 20:20:02.314173 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:02.314147 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-7qw5s_89cf816c-9dc7-43ec-aa96-5d4f109e720e/postgres/0.log" Apr 17 20:20:02.834112 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:02.834082 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4" Apr 17 20:20:03.458476 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:03.458443 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-ldl6p_3975a812-dbe2-47f2-92e3-d5146537c825/openshift-lws-operator/0.log" Apr 17 20:20:09.226815 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.226780 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-78p7f_3abe62da-aef2-4ef2-85a2-278e4f8fe4c1/kube-multus/0.log" Apr 17 20:20:09.690884 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.690801 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q4p8s_a4667d02-88e0-4ffd-a42f-77c06bdf9c21/kube-multus-additional-cni-plugins/0.log" Apr 17 20:20:09.713308 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.713279 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q4p8s_a4667d02-88e0-4ffd-a42f-77c06bdf9c21/egress-router-binary-copy/0.log" Apr 17 20:20:09.735943 ip-10-0-134-158 kubenswrapper[2567]: I0417 
20:20:09.735918 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q4p8s_a4667d02-88e0-4ffd-a42f-77c06bdf9c21/cni-plugins/0.log" Apr 17 20:20:09.757634 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.757608 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q4p8s_a4667d02-88e0-4ffd-a42f-77c06bdf9c21/bond-cni-plugin/0.log" Apr 17 20:20:09.778485 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.778452 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q4p8s_a4667d02-88e0-4ffd-a42f-77c06bdf9c21/routeoverride-cni/0.log" Apr 17 20:20:09.800178 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.800151 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q4p8s_a4667d02-88e0-4ffd-a42f-77c06bdf9c21/whereabouts-cni-bincopy/0.log" Apr 17 20:20:09.823576 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.823545 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q4p8s_a4667d02-88e0-4ffd-a42f-77c06bdf9c21/whereabouts-cni/0.log" Apr 17 20:20:09.958425 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.958341 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6vrjk_3ab88728-120f-4d07-91b8-97fe1307e061/network-metrics-daemon/0.log" Apr 17 20:20:09.978992 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:09.978964 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6vrjk_3ab88728-120f-4d07-91b8-97fe1307e061/kube-rbac-proxy/0.log" Apr 17 20:20:11.147979 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.147952 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/ovn-controller/0.log" Apr 
17 20:20:11.175487 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.175453 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/ovn-acl-logging/0.log" Apr 17 20:20:11.195280 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.195248 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/kube-rbac-proxy-node/0.log" Apr 17 20:20:11.217342 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.217316 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 20:20:11.237906 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.237838 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/northd/0.log" Apr 17 20:20:11.258936 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.258911 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/nbdb/0.log" Apr 17 20:20:11.280056 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.280028 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/sbdb/0.log" Apr 17 20:20:11.382120 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:11.382088 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ml4d4_9f323355-87cd-4d74-ba77-22d401a93474/ovnkube-controller/0.log" Apr 17 20:20:12.764287 ip-10-0-134-158 kubenswrapper[2567]: I0417 20:20:12.764256 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-target-lzj47_9b5d0119-32c6-4587-994b-0d70198060ea/network-check-target-container/0.log"
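For reference, entries in the journal format shown above can be split into fields with a small script. A minimal sketch, assuming the standard `MMM DD HH:MM:SS.ffffff host process[pid]: message` shape seen in these lines (the field names are our own labels, not part of any kubelet or systemd API):

```python
import re

# One journal-style entry, e.g.:
# "Apr 17 20:20:02.834112 ip-10-0-134-158 kubenswrapper[2567]: I0417 ... msg"
LINE_RE = re.compile(
    r"^(?P<month>\w{3}) (?P<day>\d{1,2}) (?P<time>[\d:.]+) "
    r"(?P<host>\S+) (?P<proc>[\w-]+)\[(?P<pid>\d+)\]: (?P<msg>.*)$"
)

def parse_entry(line: str) -> dict:
    """Split one journal line into its fields; raise ValueError if malformed."""
    m = LINE_RE.match(line)
    if m is None:
        raise ValueError(f"unrecognized journal line: {line[:60]!r}")
    return m.groupdict()

# Example using an entry taken verbatim from the log above.
entry = parse_entry(
    'Apr 17 20:20:02.834112 ip-10-0-134-158 kubenswrapper[2567]: '
    'I0417 20:20:02.834082 2567 kubelet.go:2658] "SyncLoop (probe)" '
    'probe="readiness" status="ready" '
    'pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-vh2r4"'
)
print(entry["proc"], entry["pid"])  # kubenswrapper 2567
```

Lines whose leading timestamp was truncated at a chunk boundary (such as the first entry above) will not match and are reported rather than silently dropped.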