Apr 21 01:46:55.904090 ip-10-0-129-42 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 01:46:55.904107 ip-10-0-129-42 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 01:46:55.904117 ip-10-0-129-42 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 01:46:55.904439 ip-10-0-129-42 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 01:47:05.923211 ip-10-0-129-42 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 01:47:05.923229 ip-10-0-129-42 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 42b0295b72a648eda7129bd63b02d824 --
Apr 21 01:49:20.918229 ip-10-0-129-42 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 01:49:21.424811 ip-10-0-129-42 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 01:49:21.424811 ip-10-0-129-42 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 01:49:21.424811 ip-10-0-129-42 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 01:49:21.424811 ip-10-0-129-42 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 01:49:21.424811 ip-10-0-129-42 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
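Every deprecation warning above points at the same fix: carry the setting in the KubeletConfiguration file passed via --config (the FLAG dump later in this log shows --config="/etc/kubernetes/kubelet.conf" on this node). A minimal sketch of the equivalent config-file stanza, assuming the flag values recorded further down; the field names are the upstream KubeletConfiguration v1beta1 ones, and the eviction threshold is an illustrative placeholder, not a value taken from this node:

# /etc/kubernetes/kubelet.conf (sketch, not this node's actual file)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (value from the FLAG dump below)
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
# --minimum-container-ttl-duration has no one-to-one mapping; the warning says to
# use eviction thresholds instead, so this value is only a placeholder
evictionHard:
  memory.available: 100Mi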
Apr 21 01:49:21.426835 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.426746    2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 01:49:21.430247 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430232    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:49:21.430247 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430247    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430251    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430254    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430256    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430259    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430262    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430264    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430267    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430269    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430271    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430274    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430276    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430280    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430283    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430286    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430288    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430291    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430293    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430296    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430298    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:49:21.430313 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430301    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430303    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430306    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430308    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430311    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430313    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430316    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430318    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430321    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430324    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430327    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430329    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430332    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430334    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430337    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430340    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430342    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430345    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430347    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430350    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:49:21.430784 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430352    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430355    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430357    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430360    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430362    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430364    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430369    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430373    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430376    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430378    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430380    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430383    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430386    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430388    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430392    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430396    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430399    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430402    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430404    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:49:21.431276 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430407    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430409    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430412    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430414    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430416    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430419    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430421    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430424    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430428    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430431    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430433    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430436    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430438    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430441    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430443    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430446    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430448    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430450    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430453    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:49:21.431727 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430455    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430458    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430460    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430463    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430465    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430468    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430470    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430833    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430839    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430842    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430845    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430848    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430851    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430854    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430857    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430859    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430862    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430864    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430867    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430869    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:49:21.432203 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430872    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430874    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430877    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430880    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430883    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430885    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430888    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430890    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430892    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430895    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430897    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430900    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430902    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430905    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430909    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430913    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430916    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430918    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430922    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:49:21.432684 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430924    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430928    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430932    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430935    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430938    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430941    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430943    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430945    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430948    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430950    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430952    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430955    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430958    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430961    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430964    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430966    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430984    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430986    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430989    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430992    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:49:21.433167 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430994    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430996    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.430999    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431002    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431004    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431007    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431009    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431012    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431014    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431017    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431019    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431021    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431024    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431027    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431030    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431032    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431035    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431038    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431041    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431043    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:49:21.433672 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431046    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431048    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431051    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431053    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431057    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431059    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431062    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431064    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431066    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431070    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431072    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431075    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431077    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.431080    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432733    2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432745    2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432752    2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432756    2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432761    2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432764    2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432768    2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 01:49:21.434182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432773    2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432776    2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432779    2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432782    2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432787    2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432790    2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432793    2579 flags.go:64] FLAG: --cgroup-root=""
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432796    2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432799    2579 flags.go:64] FLAG: --client-ca-file=""
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432802    2579 flags.go:64] FLAG: --cloud-config=""
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432805    2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432808    2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432811    2579 flags.go:64] FLAG: --cluster-domain=""
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432814    2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432817    2579 flags.go:64] FLAG: --config-dir=""
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432820    2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432823    2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432827    2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432830    2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432833    2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432836    2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432839    2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432842    2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432845    2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432848    2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 01:49:21.434678 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432851    2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432855    2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432858    2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432861    2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432864    2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432867    2579 flags.go:64] FLAG: --enable-server="true"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432869    2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432874    2579 flags.go:64] FLAG: --event-burst="100"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432877    2579 flags.go:64] FLAG: --event-qps="50"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432879    2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432883    2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432886    2579 flags.go:64] FLAG: --eviction-hard=""
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432890    2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432893    2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432896    2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432899    2579 flags.go:64] FLAG: --eviction-soft=""
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432902    2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432905    2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432907    2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432910    2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432913    2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432915    2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432918    2579 flags.go:64] FLAG: --feature-gates=""
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432921    2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432924    2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 01:49:21.435284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432928    2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432931    2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432934    2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432937    2579 flags.go:64] FLAG: --help="false"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432940    2579 flags.go:64] FLAG: --hostname-override="ip-10-0-129-42.ec2.internal"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432943    2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432946    2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432949    2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432952    2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432956    2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432958    2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432961    2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432964    2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432977    2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432981    2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432984    2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432987    2579 flags.go:64] FLAG: --kube-reserved=""
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432990    2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432995    2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.432998    2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433001    2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433004    2579 flags.go:64] FLAG: --lock-file=""
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433007    2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433009    2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 21 01:49:21.435925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433012    2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433017    2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433020    2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433023    2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433026    2579 flags.go:64] FLAG: --logging-format="text"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433028    2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433032    2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433034    2579 flags.go:64] FLAG: --manifest-url=""
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433037    2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433042    2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433045    2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433048    2579 flags.go:64] FLAG: --max-pods="110"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433051    2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433055    2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433058    2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433061    2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433063    2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433066    2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433069    2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433076    2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433079    2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433082    2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433085    2579 flags.go:64] FLAG: --pod-cidr=""
Apr 21 01:49:21.436518 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433087    2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433093    2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433096    2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433099    2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433104    2579 flags.go:64] FLAG: --port="10250"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433107    2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433110    2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0804bdbe3651a1a82"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433113    2579 flags.go:64] FLAG: --qos-reserved=""
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433116    2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433119    2579 flags.go:64] FLAG: --register-node="true"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433122    2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433124    2579 flags.go:64] FLAG: --register-with-taints=""
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433128    2579 flags.go:64] FLAG: --registry-burst="10"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433131    2579 flags.go:64] FLAG: --registry-qps="5"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433133    2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433136    2579 flags.go:64] FLAG: --reserved-memory=""
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433140    2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433143    2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433146    2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433149    2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433152    2579 flags.go:64] FLAG: --runonce="false"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433154    2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433157    2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433161    2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433164    2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433166    2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 21 01:49:21.437156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433170    2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433173    2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433176    2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433179    2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433182    2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433185    2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433187    2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433190    2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433193    2579 flags.go:64] FLAG: --system-cgroups=""
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433196    2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433202    2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433205    2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433208    2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433211    2579 flags.go:64] FLAG: --tls-min-version=""
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433214    2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433216    2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433219    2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433222    2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433225    2579 flags.go:64] FLAG: --v="2"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433229    2579 flags.go:64] FLAG: --version="false"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433233    2579 flags.go:64] FLAG: --vmodule=""
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433238    2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.433241    2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433332    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:49:21.437792 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433335    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433338    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433341    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433344    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433347    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433350    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433354    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433357    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433360    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433362    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433365    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433367    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433370    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433372    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433375    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433377    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433381    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433383    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433396    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433398    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:49:21.438391 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433401    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433403    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433406    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433408    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433411    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433414    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433416    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433419 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433421 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433423 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433426 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433428 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433431 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433433 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433436 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433438 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433441 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433443 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433446 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433449 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 01:49:21.438912 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433452 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433454 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433456 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433459 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433461 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433463 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433466 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433468 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433472 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433475 2579 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433479 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433482 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433484 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433487 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433489 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433492 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433494 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433497 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433499 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 01:49:21.439477 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433502 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433504 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433507 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433509 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433511 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433514 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433516 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433519 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433521 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433524 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433526 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433528 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433531 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 
01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433533 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433536 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433538 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433541 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433543 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433545 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433548 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 01:49:21.439958 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433550 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 01:49:21.440455 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433554 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 01:49:21.440455 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433556 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 01:49:21.440455 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433560 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 01:49:21.440455 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433562 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 01:49:21.440455 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.433566 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 01:49:21.440455 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.434415 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 01:49:21.442492 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.442471 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 01:49:21.442492 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.442489 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442538 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
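
The entry at feature_gate.go:384 is the one worth reading in this block: it is the resolved feature-gate map after all overrides were applied. The full warning block then repeats with fresh timestamps (twice more in this boot), consistent with the gate set being applied once per configuration source during startup; the resolved map printed after each pass is identical, so the repetition is benign. Since the summary prints its keys in sorted order, the maps can be compared as plain strings. A small Go check for that over a journal dump piped on stdin (the regexp is tailored to the exact format of these lines):

    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"regexp"
    )

    // Extract every "feature gates: {map[...]}" summary from the dump and
    // report whether all resolved maps agree.
    func main() {
    	re := regexp.MustCompile(`feature gates: \{map\[([^\]]*)\]\}`)
    	var maps []string
    	sc := bufio.NewScanner(os.Stdin)
    	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines are long
    	for sc.Scan() {
    		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
    			maps = append(maps, m[1])
    		}
    	}
    	fmt.Printf("found %d resolved gate maps\n", len(maps))
    	for i := 1; i < len(maps); i++ {
    		if maps[i] != maps[0] {
    			fmt.Printf("map %d differs from map 0\n", i)
    			return
    		}
    	}
    	fmt.Println("all resolved maps identical; the repetition is benign")
    }
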
Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442546 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442550 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442553 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442556 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442559 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442562 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442566 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442570 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442573 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442576 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442579 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442582 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442584 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442587 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442589 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442592 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442594 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 01:49:21.442613 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442597 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442599 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442602 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442605 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442607 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: 
W0421 01:49:21.442609 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442612 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442615 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442618 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442621 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442623 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442626 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442629 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442631 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442633 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442636 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442638 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442641 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442643 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442646 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 01:49:21.443241 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442648 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442651 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442654 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442656 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442659 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442661 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442664 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442666 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 01:49:21.443715 
ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442669 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442671 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442673 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442676 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442678 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442681 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442685 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442688 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442691 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442693 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442696 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442698 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 01:49:21.443715 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442701 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442703 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442706 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442708 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442710 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442713 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442715 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442718 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442720 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442723 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442725 2579 feature_gate.go:328] 
unrecognized feature gate: CPMSMachineNamePrefix Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442727 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442730 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442732 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442735 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442737 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442739 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442742 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442745 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442748 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 01:49:21.444255 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442750 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442753 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442755 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442757 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442760 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442762 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442765 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442768 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.442773 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442864 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 
01:49:21.442868 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442871 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442874 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442877 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442879 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442882 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 01:49:21.444728 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442884 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442887 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442889 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442892 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442894 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442896 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442899 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442903 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
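
One more pattern worth naming: within each pass the warnings come out in a different order (compare the run starting at InsightsConfig in the first block with the run starting at MultiDiskSetup here). The gates are evidently kept in a Go map, and Go randomizes map iteration order on every range loop, so each pass logs the same set in a new shuffle. A tiny demonstration:

    package main

    import "fmt"

    // Go randomizes map iteration order per range loop, so two passes over
    // the same gate set usually print in different orders, which is the
    // shuffling visible between the warning blocks in this log.
    func main() {
    	gates := map[string]bool{
    		"PinnedImages":        true,
    		"GatewayAPI":          false,
    		"RouteAdvertisements": true,
    		"DualReplica":         false,
    		"OVNObservability":    true,
    		"MultiDiskSetup":      false,
    	}
    	for pass := 1; pass <= 2; pass++ {
    		fmt.Printf("pass %d:", pass)
    		for name := range gates {
    			fmt.Printf(" %s", name)
    		}
    		fmt.Println()
    	}
    }
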
Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442907 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442909 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442912 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442915 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442918 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442921 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442923 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442927 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442929 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442931 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442934 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442936 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 01:49:21.445143 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442939 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442942 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442944 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442947 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442950 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442952 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442954 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442957 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442959 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442961 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: 
W0421 01:49:21.442964 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442966 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442990 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442995 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.442997 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443000 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443003 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443005 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443008 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 01:49:21.445628 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443010 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443013 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443015 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443018 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443020 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443023 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443025 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443028 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443030 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443034 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443037 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443040 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
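
Besides the unrecognized-gate warnings, each pass flags two known-but-sensitive settings: feature_gate.go:349 notes that KMSv1=true enables a deprecated gate (KMS v1 envelope encryption has been deprecated since Kubernetes v1.28, so the override still works but is on borrowed time), and feature_gate.go:351 notes that ServiceAccountTokenNodeBinding=true sets a gate that has already gone GA, so the explicit override is redundant and will eventually be removed. When triaging a flood like this, a tally by severity and source location is often the fastest first step; every klog record starts with a header of the form I/W/E/F, MMDD, hh:mm:ss.uuuuuu, PID, file:line]. A sketch that counts records in a journal dump from stdin (FindAll is used deliberately so it copes with several records joined on one physical line, as in this capture):

    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"regexp"
    )

    // Tally klog records by severity letter and source location,
    // e.g. "W feature_gate.go:328" mapped to its record count.
    func main() {
    	re := regexp.MustCompile(`([IWEF])\d{4} \d\d:\d\d:\d\d\.\d+\s+\d+\s+([\w./-]+:\d+)\]`)
    	counts := map[string]int{}
    	sc := bufio.NewScanner(os.Stdin)
    	sc.Buffer(make([]byte, 1024*1024), 1024*1024)
    	for sc.Scan() {
    		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
    			counts[m[1]+" "+m[2]]++
    		}
    	}
    	for key, n := range counts {
    		fmt.Printf("%6d  %s\n", n, key)
    	}
    }
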
Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443043 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443046 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443049 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443052 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443054 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443057 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443060 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443062 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 01:49:21.446124 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443065 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443067 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443070 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443072 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443074 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443077 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443079 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443082 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443084 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443087 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443089 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443091 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443094 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443096 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443099 2579 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443101 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443103 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443105 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443108 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:49:21.446590 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:21.443111 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:49:21.447135 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.443115 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 01:49:21.447135 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.444035 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 01:49:21.450565 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.450552 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 01:49:21.451581 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.451571 2579 server.go:1019] "Starting client certificate rotation"
Apr 21 01:49:21.451685 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.451665 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 01:49:21.451738 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.451700 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 01:49:21.479595 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.479574 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 01:49:21.482478 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.482456 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 01:49:21.501005 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.500986 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 21 01:49:21.509208 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.509186 2579 log.go:25] "Validated CRI v1 image API"
Apr 21 01:49:21.510111 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.510094 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 01:49:21.510814 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.510798 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 01:49:21.513300 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.513282 2579 fs.go:135] Filesystem UUIDs: map[29dc0e2b-be65-438b-baa8-ba42e3f15c42:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c2b5bbd8-86d6-42e4-84ef-b0ea5710d4a0:/dev/nvme0n1p3]
Apr 21 01:49:21.513377 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.513299 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 01:49:21.518233 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.518127 2579 manager.go:217] Machine: {Timestamp:2026-04-21 01:49:21.516774625 +0000 UTC m=+0.465865911 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097610 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27e922a56b000e52239ec409985c55 SystemUUID:ec27e922-a56b-000e-5223-9ec409985c55 BootID:42b0295b-72a6-48ed-a712-9bd63b02d824 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:de:bc:9e:a1:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:de:bc:9e:a1:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:99:ee:20:9b:dc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 01:49:21.518233 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.518228 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
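
Taken together, the two fs.go entries and the cAdvisor Machine record above are the node's hardware inventory: a single-socket Intel machine with 8 logical CPUs on 4 physical cores, 32812175360 bytes of RAM, no swap, and a 120 GiB NVMe disk carrying /boot (ext4), /var (xfs) and a composefs overlay root. br-ex and ens5 sharing MAC 02:de:bc:9e:a1:49 is consistent with an OVN-Kubernetes bridge taking over the primary NIC on this EC2 instance. Quick arithmetic on the logged constants:

    package main

    import "fmt"

    // Sanity-check the cAdvisor inventory using the constants logged above.
    func main() {
    	const (
    		memBytes  = 32812175360  // MemoryCapacity
    		logical   = 8            // NumCores (logical CPUs)
    		physical  = 4            // NumPhysicalCores
    		diskBytes = 128849018880 // nvme0n1 Size
    	)
    	fmt.Printf("memory:  %.1f GiB\n", float64(memBytes)/(1<<30)) // ~30.6 GiB
    	fmt.Printf("cpu:     %d logical = %d cores x %d threads\n",
    		logical, physical, logical/physical)
    	fmt.Printf("nvme0n1: %.0f GiB\n", float64(diskBytes)/(1<<30)) // 120 GiB
    }
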
Apr 21 01:49:21.518340 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.518309 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 01:49:21.521634 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.521611 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 01:49:21.521778 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.521637 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-42.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 01:49:21.521824 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.521788 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 01:49:21.521824 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.521797 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 01:49:21.521824 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.521809 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 01:49:21.521903 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.521829 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 01:49:21.523652 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.523641 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 01:49:21.523925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.523916 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 01:49:21.526650 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.526639 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 01:49:21.526690 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.526654 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 01:49:21.526690 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.526667 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 01:49:21.526690 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.526676 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 21 01:49:21.526796 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.526698 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 01:49:21.527904 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.527892 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 01:49:21.527951 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.527911 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 01:49:21.531290 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.531272 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 01:49:21.533134 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.533121 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 01:49:21.535383 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535370 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535389 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535396 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535402 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535408 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535417 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535426 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535431 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535438 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 01:49:21.535448 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535445 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 01:49:21.535676 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535453 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 01:49:21.535676 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535462 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 01:49:21.535676 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535492 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 01:49:21.535676 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.535500 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 01:49:21.539429 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.539415 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 01:49:21.539494 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.539452 2579 server.go:1295] "Started kubelet"
Apr 21 01:49:21.539578 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.539550 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 01:49:21.539630 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.539552 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 01:49:21.539630 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.539619 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 01:49:21.540450 ip-10-0-129-42 systemd[1]: Started Kubernetes Kubelet.
Apr 21 01:49:21.540561 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.540510 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-42.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 01:49:21.541473 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.541325 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 01:49:21.541473 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.541372 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-42.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 01:49:21.541581 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.541041 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 01:49:21.541581 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.541500 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 01:49:21.548205 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.547174 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-42.ec2.internal.18a83c2018261086 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-42.ec2.internal,UID:ip-10-0-129-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-42.ec2.internal,},FirstTimestamp:2026-04-21 01:49:21.539428486 +0000 UTC m=+0.488519772,LastTimestamp:2026-04-21 01:49:21.539428486 +0000 UTC m=+0.488519772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-42.ec2.internal,}"
Apr 21 01:49:21.550187 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.550171 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 01:49:21.550891 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.550858 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 01:49:21.550963 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.550912 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 01:49:21.551506 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551420 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bsxwq"
Apr 21 01:49:21.551781 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551733 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 01:49:21.551781 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551741 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 01:49:21.551781 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551762 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 01:49:21.552065 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551826 2579 factory.go:55] Registering systemd factory
Apr 21 01:49:21.552065 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551839 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 21 01:49:21.552065 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.551889 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found"
Apr 21 01:49:21.552065 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551913 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 01:49:21.552065 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.551920 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 01:49:21.552273 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.552164 2579 factory.go:153] Registering CRI-O factory
Apr 21 01:49:21.552273 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.552178 2579 factory.go:223] Registration of the crio container factory successfully
Apr 21 01:49:21.552273 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.552219 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 01:49:21.552273 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.552239 2579 factory.go:103] Registering Raw factory
Apr 21 01:49:21.552273 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.552254 2579 manager.go:1196] Started watching for new ooms in manager
Apr 21 01:49:21.552648 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.552632 2579 manager.go:319] Starting recovery of all containers
Apr 21 01:49:21.556297 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.556129 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bsxwq"
Apr 21 01:49:21.556297 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.556211 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-42.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 01:49:21.556543 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.556518 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 01:49:21.562481 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.562462 2579 manager.go:324] Recovery completed
Apr 21 01:49:21.566342 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.566327 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:49:21.568832 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.568811 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:49:21.568885 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.568839 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:49:21.568885 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.568851 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:49:21.569390 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.569377 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 01:49:21.569390 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.569388 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 01:49:21.569472 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.569403 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 01:49:21.570932 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.570873 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-42.ec2.internal.18a83c2019e69a76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-42.ec2.internal,UID:ip-10-0-129-42.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-42.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-42.ec2.internal,},FirstTimestamp:2026-04-21 01:49:21.568823926 +0000 UTC m=+0.517915212,LastTimestamp:2026-04-21 01:49:21.568823926 +0000 UTC m=+0.517915212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-42.ec2.internal,}"
Apr 21 01:49:21.573290 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.573277 2579 policy_none.go:49] "None policy: Start"
Apr 21 01:49:21.573355 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.573294 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 01:49:21.573355 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.573303 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 01:49:21.604520 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.604505 2579 manager.go:341] "Starting Device Plugin manager"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.604536 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.604546 2579 server.go:85] "Starting device plugin registration server"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.604792 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.604805 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.604876 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.604943 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.604951 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.605702 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 01:49:21.629046 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.605736 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-42.ec2.internal\" not found"
Apr 21 01:49:21.703312 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.703238 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 01:49:21.704616 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.704596 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 01:49:21.704731 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.704623 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 01:49:21.704731 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.704644 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 01:49:21.704731 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.704652 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 01:49:21.704731 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.704687 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 01:49:21.704896 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.704883 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 01:49:21.707812 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.707794 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 01:49:21.707903 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.707834 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientMemory" Apr 21 01:49:21.707903 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.707863 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 01:49:21.707903 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.707879 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientPID" Apr 21 01:49:21.708058 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.707908 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.715233 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.715216 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.715313 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.715238 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-42.ec2.internal\": node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:21.733000 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.732961 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:21.805595 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.805551 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal"] Apr 21 01:49:21.805693 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.805646 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 01:49:21.807063 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.807046 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientMemory" Apr 21 01:49:21.807122 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.807081 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 01:49:21.807122 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.807092 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientPID" Apr 21 01:49:21.809316 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.809304 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 01:49:21.809450 ip-10-0-129-42 kubenswrapper[2579]: I0421 
01:49:21.809437 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.809498 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.809464 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 01:49:21.810310 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.810275 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientMemory" Apr 21 01:49:21.810310 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.810283 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientMemory" Apr 21 01:49:21.810310 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.810298 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 01:49:21.810310 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.810303 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 01:49:21.810310 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.810310 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientPID" Apr 21 01:49:21.810526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.810317 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientPID" Apr 21 01:49:21.812473 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.812458 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.812565 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.812481 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 01:49:21.813114 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.813100 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientMemory" Apr 21 01:49:21.813196 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.813123 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 01:49:21.813196 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.813132 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeHasSufficientPID" Apr 21 01:49:21.833331 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.833312 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:21.840627 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.840606 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-42.ec2.internal\" not found" node="ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.845059 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.845044 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-42.ec2.internal\" not found" node="ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.853221 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.853203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58381b68b4623b46ab7cd9e4f4303667-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal\" (UID: \"58381b68b4623b46ab7cd9e4f4303667\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.853271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.853235 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90d1b61eadce6e4cc6b47dcca75f1db0-config\") pod \"kube-apiserver-proxy-ip-10-0-129-42.ec2.internal\" (UID: \"90d1b61eadce6e4cc6b47dcca75f1db0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.853271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.853251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/58381b68b4623b46ab7cd9e4f4303667-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal\" (UID: \"58381b68b4623b46ab7cd9e4f4303667\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.934232 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:21.934183 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:21.953577 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.953526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/58381b68b4623b46ab7cd9e4f4303667-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal\" (UID: \"58381b68b4623b46ab7cd9e4f4303667\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.953577 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.953547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/58381b68b4623b46ab7cd9e4f4303667-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal\" (UID: \"58381b68b4623b46ab7cd9e4f4303667\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.953686 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.953590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58381b68b4623b46ab7cd9e4f4303667-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal\" (UID: \"58381b68b4623b46ab7cd9e4f4303667\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.953686 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.953610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90d1b61eadce6e4cc6b47dcca75f1db0-config\") pod \"kube-apiserver-proxy-ip-10-0-129-42.ec2.internal\" (UID: \"90d1b61eadce6e4cc6b47dcca75f1db0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.953686 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.953645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90d1b61eadce6e4cc6b47dcca75f1db0-config\") pod \"kube-apiserver-proxy-ip-10-0-129-42.ec2.internal\" (UID: \"90d1b61eadce6e4cc6b47dcca75f1db0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" Apr 21 01:49:21.953686 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:21.953678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/58381b68b4623b46ab7cd9e4f4303667-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal\" (UID: \"58381b68b4623b46ab7cd9e4f4303667\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:22.034957 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.034923 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.135413 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.135374 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.142543 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.142525 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:22.147858 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.147840 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" Apr 21 01:49:22.236329 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.236302 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.336809 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.336777 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.390925 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.390895 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 01:49:22.437741 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.437709 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.451884 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.451860 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 01:49:22.452046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.452027 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 01:49:22.452046 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.452032 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 01:49:22.538603 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.538579 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.550286 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.550263 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 01:49:22.560498 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.560473 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 01:44:21 +0000 UTC" deadline="2027-09-29 02:50:59.614849237 +0000 UTC" Apr 21 01:49:22.560498 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.560498 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12625h1m37.054355788s" Apr 21 01:49:22.562746 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.562730 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 01:49:22.582001 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.581963 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-94zlm" Apr 21 01:49:22.589154 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.589135 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-94zlm" Apr 21 01:49:22.638729 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.638702 2579 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.727626 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:22.727581 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58381b68b4623b46ab7cd9e4f4303667.slice/crio-9f453509b75673f0192d1312aac87bb73a177b409ab8039931f36cffa7257465 WatchSource:0}: Error finding container 9f453509b75673f0192d1312aac87bb73a177b409ab8039931f36cffa7257465: Status 404 returned error can't find the container with id 9f453509b75673f0192d1312aac87bb73a177b409ab8039931f36cffa7257465 Apr 21 01:49:22.727828 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:22.727812 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d1b61eadce6e4cc6b47dcca75f1db0.slice/crio-f02f1aa14ad8968ce3e27a4ae5061824239a604b1c6eb272e506afaea1e11d2d WatchSource:0}: Error finding container f02f1aa14ad8968ce3e27a4ae5061824239a604b1c6eb272e506afaea1e11d2d: Status 404 returned error can't find the container with id f02f1aa14ad8968ce3e27a4ae5061824239a604b1c6eb272e506afaea1e11d2d Apr 21 01:49:22.732427 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.732409 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 01:49:22.739332 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.739314 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.840227 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.840133 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.912016 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.911998 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 01:49:22.941058 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:22.941020 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-42.ec2.internal\" not found" Apr 21 01:49:22.956686 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:22.956661 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 01:49:23.051895 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.051863 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" Apr 21 01:49:23.064197 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.064174 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 01:49:23.065028 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.065016 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" Apr 21 01:49:23.073014 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.072998 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 01:49:23.527561 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.527531 2579 apiserver.go:52] "Watching apiserver" Apr 21 01:49:23.535667 
ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.535643 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 01:49:23.536120 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.536090 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ndnpw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal","openshift-multus/multus-additional-cni-plugins-ljkm7","openshift-multus/multus-cxcst","openshift-multus/network-metrics-daemon-lwrh2","openshift-network-diagnostics/network-check-target-tn9c4","openshift-ovn-kubernetes/ovnkube-node-d79xd","kube-system/konnectivity-agent-jvwm9","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr","openshift-cluster-node-tuning-operator/tuned-v7wv4","openshift-image-registry/node-ca-wv5c7","openshift-network-operator/iptables-alerter-2h8jz","kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal"] Apr 21 01:49:23.538665 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.538641 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jvwm9" Apr 21 01:49:23.541936 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.541002 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wv5c7" Apr 21 01:49:23.541936 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.541340 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 01:49:23.541936 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.541416 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 01:49:23.541936 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.541461 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rqt2m\"" Apr 21 01:49:23.543374 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.543226 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 01:49:23.543596 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.543577 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gnmgg\"" Apr 21 01:49:23.543827 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.543810 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 01:49:23.544194 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.543993 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 01:49:23.545451 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.545432 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.547549 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.547622 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.547715 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.548214 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zbrlm\"" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.548222 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.548372 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.548397 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.548524 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 01:49:23.548660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.548611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 01:49:23.550054 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.550014 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 01:49:23.550141 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.550018 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 01:49:23.550333 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.550319 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:23.550407 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.550379 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869" Apr 21 01:49:23.550894 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.550874 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h2t9f\"" Apr 21 01:49:23.550990 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.550885 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 01:49:23.552596 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.552577 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.554949 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.554839 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 01:49:23.554949 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.554893 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nc6t6\"" Apr 21 01:49:23.556270 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.555181 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.556270 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.555243 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 01:49:23.556270 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.555607 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 01:49:23.556270 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.555608 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 01:49:23.556270 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.555809 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 01:49:23.557630 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.557535 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 01:49:23.557806 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.557785 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f7p8j\"" Apr 21 01:49:23.557908 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.557871 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 01:49:23.558801 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.558780 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ndnpw" Apr 21 01:49:23.561394 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.561370 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 01:49:23.561870 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.561626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4gxfw\"" Apr 21 01:49:23.561870 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.561693 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 01:49:23.562470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-var-lib-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.562569 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovnkube-config\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.562616 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:23.562616 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b70ba3-d144-4882-86ae-cb9a2e4391a4-serviceca\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7" Apr 21 01:49:23.562679 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562625 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-os-release\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.562679 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562641 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-run-netns\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.562737 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562679 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-cni-netd\") pod \"ovnkube-node-d79xd\" (UID: 
\"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.562737 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovnkube-script-lib\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.562794 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.562744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rhls\" (UniqueName: \"kubernetes.io/projected/202b33ea-296c-464b-8ee5-774d048859c2-kube-api-access-5rhls\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:23.563256 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563238 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-systemd\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.563338 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.563338 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-registration-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.563338 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563303 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-systemd-units\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.563338 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563654 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-cni-bin\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 
01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563699 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-env-overrides\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563735 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtlmg\" (UniqueName: \"kubernetes.io/projected/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-kube-api-access-wtlmg\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-cnibin\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563806 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-sys-fs\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-slash\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563897 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-node-log\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e0901c2f-3720-4876-9776-5d298a25b8a3-agent-certs\") pod \"konnectivity-agent-jvwm9\" (UID: \"e0901c2f-3720-4876-9776-5d298a25b8a3\") " pod="kube-system/konnectivity-agent-jvwm9" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.563961 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2h8jz" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jd8r\" (UniqueName: \"kubernetes.io/projected/62c83f40-5302-4d3d-a41f-36cd9302f7f0-kube-api-access-7jd8r\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-etc-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564239 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovn-node-metrics-cert\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e0901c2f-3720-4876-9776-5d298a25b8a3-konnectivity-ca\") pod \"konnectivity-agent-jvwm9\" (UID: \"e0901c2f-3720-4876-9776-5d298a25b8a3\") " pod="kube-system/konnectivity-agent-jvwm9" Apr 21 01:49:23.565958 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564302 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b70ba3-d144-4882-86ae-cb9a2e4391a4-host\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j542v\" (UniqueName: \"kubernetes.io/projected/69b70ba3-d144-4882-86ae-cb9a2e4391a4-kube-api-access-j542v\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7" Apr 21 01:49:23.566861 
ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564364 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-system-cni-dir\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfx77\" (UniqueName: \"kubernetes.io/projected/16bf952c-6936-44eb-a262-16dac395d351-kube-api-access-lfx77\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-kubelet\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-log-socket\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564586 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-run-ovn-kubernetes\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564664 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-socket-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-ovn\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564832 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-device-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.564925 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.565050 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.566861 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.566603 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 01:49:23.570064 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.570043 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 01:49:23.570352 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.570323 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 01:49:23.570436 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.570417 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fqxf6\""
Apr 21 01:49:23.570511 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.570492 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 01:49:23.570759 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.570728 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 01:49:23.573545 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.573507 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nhnc9\""
Apr 21 01:49:23.591163 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.591139 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 01:44:22 +0000 UTC" deadline="2027-12-07 13:21:32.370595121 +0000 UTC"
Apr 21 01:49:23.591163 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.591161 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14291h32m8.779436128s"
Apr 21 01:49:23.610931 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.610903 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:49:23.652617 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.652595 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 01:49:23.665591 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.665743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.665743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-cni-bin\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.665743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665674 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtlmg\" (UniqueName: \"kubernetes.io/projected/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-kube-api-access-wtlmg\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.665743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.665743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665706 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43369dc-2b67-4f61-bd42-5f76536f5501-tmp\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.665743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-cni-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-cni-bin\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-cni-bin\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665814 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovn-node-metrics-cert\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/286bb4a3-718c-4b23-ae28-f66678e8128c-iptables-alerter-script\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e0901c2f-3720-4876-9776-5d298a25b8a3-konnectivity-ca\") pod \"konnectivity-agent-jvwm9\" (UID: \"e0901c2f-3720-4876-9776-5d298a25b8a3\") " pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j542v\" (UniqueName: \"kubernetes.io/projected/69b70ba3-d144-4882-86ae-cb9a2e4391a4-kube-api-access-j542v\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfx77\" (UniqueName: \"kubernetes.io/projected/16bf952c-6936-44eb-a262-16dac395d351-kube-api-access-lfx77\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-log-socket\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.665966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-run-ovn-kubernetes\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86fcl\" (UniqueName: \"kubernetes.io/projected/18adaff4-c987-4e36-8131-4023c56e79c0-kube-api-access-86fcl\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysconfig\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:23.666081 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-socket-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666102 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-kubernetes\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-var-lib-kubelet\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666178 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-device-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-kubelet\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-cni-binary-copy\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666231 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666322 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-socket-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-log-socket\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e0901c2f-3720-4876-9776-5d298a25b8a3-konnectivity-ca\") pod \"konnectivity-agent-jvwm9\" (UID: \"e0901c2f-3720-4876-9776-5d298a25b8a3\") " pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-run-ovn-kubernetes\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-device-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666259 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-multus-certs\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666562 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.666728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666571 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-run-netns\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-cni-netd\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovnkube-script-lib\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-cni-netd\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666673 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-lib-modules\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-run-netns\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rhls\" (UniqueName: \"kubernetes.io/projected/202b33ea-296c-464b-8ee5-774d048859c2-kube-api-access-5rhls\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-systemd\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666858 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-etc-kubernetes\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-tuned\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-registration-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-systemd\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-systemd-units\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.666997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-systemd-units\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-env-overrides\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-cni-multus\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.667582 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667067 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-conf-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-registration-dir\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-run\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-cnibin\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667218 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovnkube-script-lib\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-cnibin\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667239 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-sys-fs\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-slash\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-node-log\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw8fw\" (UniqueName: \"kubernetes.io/projected/a45eec20-9193-41a2-a65b-20a0623f23d5-kube-api-access-hw8fw\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667346 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-slash\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667355 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-env-overrides\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667348 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-sys\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667371 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667391 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-node-log\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e0901c2f-3720-4876-9776-5d298a25b8a3-agent-certs\") pod \"konnectivity-agent-jvwm9\" (UID: \"e0901c2f-3720-4876-9776-5d298a25b8a3\") " pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:23.668470 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jd8r\" (UniqueName: \"kubernetes.io/projected/62c83f40-5302-4d3d-a41f-36cd9302f7f0-kube-api-access-7jd8r\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/62c83f40-5302-4d3d-a41f-36cd9302f7f0-sys-fs\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-etc-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667684 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-etc-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysctl-d\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667749 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysctl-conf\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/286bb4a3-718c-4b23-ae28-f66678e8128c-host-slash\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b70ba3-d144-4882-86ae-cb9a2e4391a4-host\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667853 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-system-cni-dir\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667876 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-kubelet\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69b70ba3-d144-4882-86ae-cb9a2e4391a4-host\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-system-cni-dir\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667930 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-host-kubelet\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667944 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-netns\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-ovn\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.667997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/16bf952c-6936-44eb-a262-16dac395d351-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.669258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668013 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-cnibin\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-os-release\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668055 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-run-ovn\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18adaff4-c987-4e36-8131-4023c56e79c0-cni-binary-copy\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668105 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a45eec20-9193-41a2-a65b-20a0623f23d5-hosts-file\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668129 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdjc\" (UniqueName: \"kubernetes.io/projected/286bb4a3-718c-4b23-ae28-f66678e8128c-kube-api-access-zrdjc\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668158 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-var-lib-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovnkube-config\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-system-cni-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668229 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-var-lib-openvswitch\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668238 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-socket-dir-parent\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a45eec20-9193-41a2-a65b-20a0623f23d5-tmp-dir\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668370 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-modprobe-d\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-systemd\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b70ba3-d144-4882-86ae-cb9a2e4391a4-serviceca\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-os-release\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.670107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668490 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-k8s-cni-cncf-io\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/16bf952c-6936-44eb-a262-16dac395d351-os-release\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.668646 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-host\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.668726 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:24.168686993 +0000 UTC m=+3.117778266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668742 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-hostroot\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18adaff4-c987-4e36-8131-4023c56e79c0-multus-daemon-config\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.668924 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnm7v\" (UniqueName: \"kubernetes.io/projected/f43369dc-2b67-4f61-bd42-5f76536f5501-kube-api-access-xnm7v\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.669181 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovnkube-config\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.670891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.669287 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/69b70ba3-d144-4882-86ae-cb9a2e4391a4-serviceca\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7"
Apr 21 01:49:23.671658 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.671639 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-ovn-node-metrics-cert\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.671781 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.671762 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e0901c2f-3720-4876-9776-5d298a25b8a3-agent-certs\") pod \"konnectivity-agent-jvwm9\" (UID: \"e0901c2f-3720-4876-9776-5d298a25b8a3\") " pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:23.673700 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.673679 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:49:23.673804 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.673704 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:49:23.673804 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.673717 2579 projected.go:194] Error preparing data for projected volume kube-api-access-8nvvh for pod openshift-network-diagnostics/network-check-target-tn9c4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:49:23.673804 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:23.673794 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh podName:a1f54627-ed53-4b78-85cb-f07dd7bc3869 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:24.173777134 +0000 UTC m=+3.122868427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8nvvh" (UniqueName: "kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh") pod "network-check-target-tn9c4" (UID: "a1f54627-ed53-4b78-85cb-f07dd7bc3869") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:49:23.677187 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.677157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j542v\" (UniqueName: \"kubernetes.io/projected/69b70ba3-d144-4882-86ae-cb9a2e4391a4-kube-api-access-j542v\") pod \"node-ca-wv5c7\" (UID: \"69b70ba3-d144-4882-86ae-cb9a2e4391a4\") " pod="openshift-image-registry/node-ca-wv5c7"
Apr 21 01:49:23.677271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.677254 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfx77\" (UniqueName: \"kubernetes.io/projected/16bf952c-6936-44eb-a262-16dac395d351-kube-api-access-lfx77\") pod \"multus-additional-cni-plugins-ljkm7\" (UID: \"16bf952c-6936-44eb-a262-16dac395d351\") " pod="openshift-multus/multus-additional-cni-plugins-ljkm7"
Apr 21 01:49:23.677420 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.677401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rhls\" (UniqueName: \"kubernetes.io/projected/202b33ea-296c-464b-8ee5-774d048859c2-kube-api-access-5rhls\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:23.677511 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.677485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtlmg\" (UniqueName: \"kubernetes.io/projected/4f79c179-ef2b-42fc-ae2f-cbf567b17a05-kube-api-access-wtlmg\") pod \"ovnkube-node-d79xd\" (UID: \"4f79c179-ef2b-42fc-ae2f-cbf567b17a05\") " pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:23.678556 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.678525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jd8r\" (UniqueName: \"kubernetes.io/projected/62c83f40-5302-4d3d-a41f-36cd9302f7f0-kube-api-access-7jd8r\") pod \"aws-ebs-csi-driver-node-pvfrr\" (UID: \"62c83f40-5302-4d3d-a41f-36cd9302f7f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr"
Apr 21 01:49:23.708802 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.708750 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" event={"ID":"58381b68b4623b46ab7cd9e4f4303667","Type":"ContainerStarted","Data":"9f453509b75673f0192d1312aac87bb73a177b409ab8039931f36cffa7257465"}
Apr 21 01:49:23.709675 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.709652 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" event={"ID":"90d1b61eadce6e4cc6b47dcca75f1db0","Type":"ContainerStarted","Data":"f02f1aa14ad8968ce3e27a4ae5061824239a604b1c6eb272e506afaea1e11d2d"}
Apr 21 01:49:23.769996 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.769955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-cnibin\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.769996 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.769998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-os-release\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18adaff4-c987-4e36-8131-4023c56e79c0-cni-binary-copy\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770054 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a45eec20-9193-41a2-a65b-20a0623f23d5-hosts-file\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdjc\" (UniqueName: \"kubernetes.io/projected/286bb4a3-718c-4b23-ae28-f66678e8128c-kube-api-access-zrdjc\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-system-cni-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-cnibin\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-os-release\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-socket-dir-parent\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770146 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-socket-dir-parent\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a45eec20-9193-41a2-a65b-20a0623f23d5-tmp-dir\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-system-cni-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-modprobe-d\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770218 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-systemd\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-k8s-cni-cncf-io\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770282 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-host\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-hostroot\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18adaff4-c987-4e36-8131-4023c56e79c0-multus-daemon-config\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnm7v\" (UniqueName: \"kubernetes.io/projected/f43369dc-2b67-4f61-bd42-5f76536f5501-kube-api-access-xnm7v\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770404 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43369dc-2b67-4f61-bd42-5f76536f5501-tmp\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-cni-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770439 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a45eec20-9193-41a2-a65b-20a0623f23d5-tmp-dir\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-cni-bin\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a45eec20-9193-41a2-a65b-20a0623f23d5-hosts-file\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/286bb4a3-718c-4b23-ae28-f66678e8128c-iptables-alerter-script\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86fcl\" (UniqueName: \"kubernetes.io/projected/18adaff4-c987-4e36-8131-4023c56e79c0-kube-api-access-86fcl\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysconfig\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.770604 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770596 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-kubernetes\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-var-lib-kubelet\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/18adaff4-c987-4e36-8131-4023c56e79c0-cni-binary-copy\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-kubelet\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-multus-certs\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-lib-modules\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-systemd\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770771 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-cni-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-cni-bin\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-modprobe-d\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-lib-modules\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770916 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/18adaff4-c987-4e36-8131-4023c56e79c0-multus-daemon-config\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-hostroot\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.770963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-etc-kubernetes\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771076 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-host\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771090 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysconfig\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-tuned\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4"
Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]:
I0421 01:49:23.771139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-cni-multus\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.771271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771141 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-kubelet\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-conf-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-run\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771277 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/286bb4a3-718c-4b23-ae28-f66678e8128c-iptables-alerter-script\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-var-lib-cni-multus\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-multus-certs\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-etc-kubernetes\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-k8s-cni-cncf-io\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771347 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-var-lib-kubelet\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771359 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-multus-conf-dir\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771387 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-run\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-kubernetes\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw8fw\" (UniqueName: \"kubernetes.io/projected/a45eec20-9193-41a2-a65b-20a0623f23d5-kube-api-access-hw8fw\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-sys\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysctl-d\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysctl-conf\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/286bb4a3-718c-4b23-ae28-f66678e8128c-host-slash\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771662 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-netns\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772079 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771719 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/18adaff4-c987-4e36-8131-4023c56e79c0-host-run-netns\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.772733 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysctl-conf\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772733 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771791 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/286bb4a3-718c-4b23-ae28-f66678e8128c-host-slash\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz" Apr 21 01:49:23.772733 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-sysctl-d\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772733 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.771853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f43369dc-2b67-4f61-bd42-5f76536f5501-sys\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.772901 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.772880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f43369dc-2b67-4f61-bd42-5f76536f5501-tmp\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.773341 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.773314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f43369dc-2b67-4f61-bd42-5f76536f5501-etc-tuned\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.779447 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.779368 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86fcl\" (UniqueName: \"kubernetes.io/projected/18adaff4-c987-4e36-8131-4023c56e79c0-kube-api-access-86fcl\") pod \"multus-cxcst\" (UID: \"18adaff4-c987-4e36-8131-4023c56e79c0\") " pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.779447 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.779366 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnm7v\" 
(UniqueName: \"kubernetes.io/projected/f43369dc-2b67-4f61-bd42-5f76536f5501-kube-api-access-xnm7v\") pod \"tuned-v7wv4\" (UID: \"f43369dc-2b67-4f61-bd42-5f76536f5501\") " pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:23.779776 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.779755 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw8fw\" (UniqueName: \"kubernetes.io/projected/a45eec20-9193-41a2-a65b-20a0623f23d5-kube-api-access-hw8fw\") pod \"node-resolver-ndnpw\" (UID: \"a45eec20-9193-41a2-a65b-20a0623f23d5\") " pod="openshift-dns/node-resolver-ndnpw" Apr 21 01:49:23.779828 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.779768 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdjc\" (UniqueName: \"kubernetes.io/projected/286bb4a3-718c-4b23-ae28-f66678e8128c-kube-api-access-zrdjc\") pod \"iptables-alerter-2h8jz\" (UID: \"286bb4a3-718c-4b23-ae28-f66678e8128c\") " pod="openshift-network-operator/iptables-alerter-2h8jz" Apr 21 01:49:23.852693 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.852655 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jvwm9" Apr 21 01:49:23.862308 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.862286 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wv5c7" Apr 21 01:49:23.877988 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.877940 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" Apr 21 01:49:23.883633 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.883611 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" Apr 21 01:49:23.891280 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.891258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:49:23.897880 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.897860 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cxcst" Apr 21 01:49:23.905431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.905414 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ndnpw" Apr 21 01:49:23.912010 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.911993 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2h8jz" Apr 21 01:49:23.917625 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:23.917608 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" Apr 21 01:49:24.173638 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.173568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:24.173801 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:24.173702 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:24.173801 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:24.173757 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:25.173742231 +0000 UTC m=+4.122833504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:24.274237 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.274195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:24.274397 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:24.274323 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:49:24.274397 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:24.274343 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:49:24.274397 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:24.274357 2579 projected.go:194] Error preparing data for projected volume kube-api-access-8nvvh for pod openshift-network-diagnostics/network-check-target-tn9c4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:24.274531 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:24.274410 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh podName:a1f54627-ed53-4b78-85cb-f07dd7bc3869 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:25.274395945 +0000 UTC m=+4.223487218 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8nvvh" (UniqueName: "kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh") pod "network-check-target-tn9c4" (UID: "a1f54627-ed53-4b78-85cb-f07dd7bc3869") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:24.400335 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:24.400301 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0901c2f_3720_4876_9776_5d298a25b8a3.slice/crio-9d3a1db7addd07573c0dba1079fe0885bfa625ddfccf9b0b4eba41ce396d527b WatchSource:0}: Error finding container 9d3a1db7addd07573c0dba1079fe0885bfa625ddfccf9b0b4eba41ce396d527b: Status 404 returned error can't find the container with id 9d3a1db7addd07573c0dba1079fe0885bfa625ddfccf9b0b4eba41ce396d527b Apr 21 01:49:24.405370 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:24.405116 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43369dc_2b67_4f61_bd42_5f76536f5501.slice/crio-289a12675402120ccb06cb85b455559a4fe66aa70768a64471af6e28834b4900 WatchSource:0}: Error finding container 289a12675402120ccb06cb85b455559a4fe66aa70768a64471af6e28834b4900: Status 404 returned error can't find the container with id 289a12675402120ccb06cb85b455559a4fe66aa70768a64471af6e28834b4900 Apr 21 01:49:24.412916 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:24.412884 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16bf952c_6936_44eb_a262_16dac395d351.slice/crio-aa11ccd47532e424f3d2f8bd8718aece7743c08423ebabdc50ed8840efeeed2a WatchSource:0}: Error finding container aa11ccd47532e424f3d2f8bd8718aece7743c08423ebabdc50ed8840efeeed2a: Status 404 returned error can't find the container with id aa11ccd47532e424f3d2f8bd8718aece7743c08423ebabdc50ed8840efeeed2a Apr 21 01:49:24.434336 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:24.434309 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c83f40_5302_4d3d_a41f_36cd9302f7f0.slice/crio-3a79fc67c04c829e0a151c123d9a339c53376a54f518eca0f39fae117601b8eb WatchSource:0}: Error finding container 3a79fc67c04c829e0a151c123d9a339c53376a54f518eca0f39fae117601b8eb: Status 404 returned error can't find the container with id 3a79fc67c04c829e0a151c123d9a339c53376a54f518eca0f39fae117601b8eb Apr 21 01:49:24.434801 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:24.434778 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f79c179_ef2b_42fc_ae2f_cbf567b17a05.slice/crio-73c71807400999e8e1a095bbf4b71e980fe144cd1a382af85ba98d8b4a408d9e WatchSource:0}: Error finding container 73c71807400999e8e1a095bbf4b71e980fe144cd1a382af85ba98d8b4a408d9e: Status 404 returned error can't find the container with id 73c71807400999e8e1a095bbf4b71e980fe144cd1a382af85ba98d8b4a408d9e Apr 21 01:49:24.435682 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:24.435647 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda45eec20_9193_41a2_a65b_20a0623f23d5.slice/crio-1f4dd037e3d31af881d5a5a5dc2bd006dd591b66e4aa60789ee05a7f190d148a WatchSource:0}: Error finding 
container 1f4dd037e3d31af881d5a5a5dc2bd006dd591b66e4aa60789ee05a7f190d148a: Status 404 returned error can't find the container with id 1f4dd037e3d31af881d5a5a5dc2bd006dd591b66e4aa60789ee05a7f190d148a Apr 21 01:49:24.591293 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.591260 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 01:44:22 +0000 UTC" deadline="2027-10-31 03:38:47.802768734 +0000 UTC" Apr 21 01:49:24.591293 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.591289 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13393h49m23.211482493s" Apr 21 01:49:24.711898 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.711785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wv5c7" event={"ID":"69b70ba3-d144-4882-86ae-cb9a2e4391a4","Type":"ContainerStarted","Data":"a5e3b77c2a11bbfe719f3b4272dc3ce3e95293fe0271c5811276dd4f4a0a213e"} Apr 21 01:49:24.713273 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.713246 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" event={"ID":"90d1b61eadce6e4cc6b47dcca75f1db0","Type":"ContainerStarted","Data":"d80f947561680f2f8ad6d209ebf212035ce3f1014cdcfa293fb70b323211b83d"} Apr 21 01:49:24.715630 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.715577 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ndnpw" event={"ID":"a45eec20-9193-41a2-a65b-20a0623f23d5","Type":"ContainerStarted","Data":"1f4dd037e3d31af881d5a5a5dc2bd006dd591b66e4aa60789ee05a7f190d148a"} Apr 21 01:49:24.717510 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.717489 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"73c71807400999e8e1a095bbf4b71e980fe144cd1a382af85ba98d8b4a408d9e"} Apr 21 01:49:24.718651 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.718631 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" event={"ID":"62c83f40-5302-4d3d-a41f-36cd9302f7f0","Type":"ContainerStarted","Data":"3a79fc67c04c829e0a151c123d9a339c53376a54f518eca0f39fae117601b8eb"} Apr 21 01:49:24.719561 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.719542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerStarted","Data":"aa11ccd47532e424f3d2f8bd8718aece7743c08423ebabdc50ed8840efeeed2a"} Apr 21 01:49:24.720362 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.720344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jvwm9" event={"ID":"e0901c2f-3720-4876-9776-5d298a25b8a3","Type":"ContainerStarted","Data":"9d3a1db7addd07573c0dba1079fe0885bfa625ddfccf9b0b4eba41ce396d527b"} Apr 21 01:49:24.721289 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.721264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cxcst" event={"ID":"18adaff4-c987-4e36-8131-4023c56e79c0","Type":"ContainerStarted","Data":"1b4cca82661d97d5c2dd788ee00aeaae637d10535838e7d6b7e930948c802f5f"} Apr 21 01:49:24.722427 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.722393 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-operator/iptables-alerter-2h8jz" event={"ID":"286bb4a3-718c-4b23-ae28-f66678e8128c","Type":"ContainerStarted","Data":"c32c1752a0ca0f979f48fcdd629f6ab9d22f08c3ab8165d3d1bcf9a46f3d843c"} Apr 21 01:49:24.723254 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.723228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" event={"ID":"f43369dc-2b67-4f61-bd42-5f76536f5501","Type":"ContainerStarted","Data":"289a12675402120ccb06cb85b455559a4fe66aa70768a64471af6e28834b4900"} Apr 21 01:49:24.725893 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:24.725852 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-42.ec2.internal" podStartSLOduration=1.725841465 podStartE2EDuration="1.725841465s" podCreationTimestamp="2026-04-21 01:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:49:24.725606559 +0000 UTC m=+3.674697854" watchObservedRunningTime="2026-04-21 01:49:24.725841465 +0000 UTC m=+3.674932759" Apr 21 01:49:25.085189 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:25.085155 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 01:49:25.181878 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:25.181834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:25.182085 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.182066 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:25.182180 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.182137 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:27.18211532 +0000 UTC m=+6.131206599 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:25.282239 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:25.282199 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:25.282512 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.282493 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:49:25.282605 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.282519 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:49:25.282605 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.282532 2579 projected.go:194] Error preparing data for projected volume kube-api-access-8nvvh for pod openshift-network-diagnostics/network-check-target-tn9c4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:25.282605 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.282594 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh podName:a1f54627-ed53-4b78-85cb-f07dd7bc3869 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:27.282575258 +0000 UTC m=+6.231666546 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nvvh" (UniqueName: "kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh") pod "network-check-target-tn9c4" (UID: "a1f54627-ed53-4b78-85cb-f07dd7bc3869") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:25.706215 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:25.706185 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:25.706744 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.706320 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869" Apr 21 01:49:25.706744 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:25.706380 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:25.706744 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:25.706515 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2" Apr 21 01:49:25.746322 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:25.746280 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" event={"ID":"58381b68b4623b46ab7cd9e4f4303667","Type":"ContainerStarted","Data":"127bce510b2de5064d3b49ed85cc706aedfc363478107f026aafbdbb1f386823"} Apr 21 01:49:26.772022 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:26.771320 2579 generic.go:358] "Generic (PLEG): container finished" podID="58381b68b4623b46ab7cd9e4f4303667" containerID="127bce510b2de5064d3b49ed85cc706aedfc363478107f026aafbdbb1f386823" exitCode=0 Apr 21 01:49:26.772022 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:26.771397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" event={"ID":"58381b68b4623b46ab7cd9e4f4303667","Type":"ContainerDied","Data":"127bce510b2de5064d3b49ed85cc706aedfc363478107f026aafbdbb1f386823"} Apr 21 01:49:27.197590 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:27.197554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:27.197758 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.197687 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:27.197758 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.197737 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:31.197723885 +0000 UTC m=+10.146815158 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:27.298437 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:27.298404 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:27.298599 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.298575 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:49:27.298599 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.298595 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:49:27.298718 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.298608 2579 projected.go:194] Error preparing data for projected volume kube-api-access-8nvvh for pod openshift-network-diagnostics/network-check-target-tn9c4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:27.298718 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.298665 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh podName:a1f54627-ed53-4b78-85cb-f07dd7bc3869 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:31.298643193 +0000 UTC m=+10.247734466 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nvvh" (UniqueName: "kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh") pod "network-check-target-tn9c4" (UID: "a1f54627-ed53-4b78-85cb-f07dd7bc3869") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:27.706777 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:27.706092 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:27.706777 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.706209 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869" Apr 21 01:49:27.706777 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:27.706632 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:27.706777 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:27.706731 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2" Apr 21 01:49:29.705112 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:29.705065 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:29.705112 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:29.705095 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:29.705554 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:29.705202 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869" Apr 21 01:49:29.705554 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:29.705359 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2" Apr 21 01:49:31.228985 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:31.228927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:31.229440 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.229096 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:31.229440 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.229177 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:39.229154244 +0000 UTC m=+18.178245522 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:31.329758 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:31.329719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:31.329913 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.329891 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:49:31.329999 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.329918 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:49:31.329999 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.329931 2579 projected.go:194] Error preparing data for projected volume kube-api-access-8nvvh for pod openshift-network-diagnostics/network-check-target-tn9c4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:31.330102 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.330010 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh podName:a1f54627-ed53-4b78-85cb-f07dd7bc3869 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:39.329990865 +0000 UTC m=+18.279082142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nvvh" (UniqueName: "kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh") pod "network-check-target-tn9c4" (UID: "a1f54627-ed53-4b78-85cb-f07dd7bc3869") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:31.706187 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:31.706108 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:31.706187 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:31.706135 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:31.706420 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.706231 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869" Apr 21 01:49:31.707325 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:31.707276 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2" Apr 21 01:49:33.705456 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:33.705427 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:33.705456 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:33.705442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:33.705883 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:33.705538 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869" Apr 21 01:49:33.705883 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:33.705674 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2" Apr 21 01:49:35.705413 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:35.705377 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:35.705852 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:35.705422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:35.705852 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:35.705481 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869" Apr 21 01:49:35.705852 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:35.705623 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2" Apr 21 01:49:37.705339 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:37.705305 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:37.705764 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:37.705305 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:37.705764 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:37.705437 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:37.705764 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:37.705505 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:39.286701 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:39.286660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:39.287116 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.286836 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:49:39.287116 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.286904 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:55.28688841 +0000 UTC m=+34.235979687 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:49:39.387876 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:39.387836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:39.388096 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.388005 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:49:39.388096 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.388030 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:49:39.388096 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.388043 2579 projected.go:194] Error preparing data for projected volume kube-api-access-8nvvh for pod openshift-network-diagnostics/network-check-target-tn9c4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:49:39.388268 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.388102 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh podName:a1f54627-ed53-4b78-85cb-f07dd7bc3869 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:55.388083964 +0000 UTC m=+34.337175243 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nvvh" (UniqueName: "kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh") pod "network-check-target-tn9c4" (UID: "a1f54627-ed53-4b78-85cb-f07dd7bc3869") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:49:39.704949 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:39.704868 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:39.705145 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:39.704872 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:39.705145 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.705010 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:39.705145 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:39.705095 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:41.706434 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:41.706345 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:41.706929 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:41.706430 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:41.706929 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:41.706517 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:41.706929 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:41.706636 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:42.802922 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.802499 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" event={"ID":"58381b68b4623b46ab7cd9e4f4303667","Type":"ContainerStarted","Data":"72dde16f60a13c12b4ffc7f97749cfea0e1ea26be06434bb19eacafa19c6dbaa"}
Apr 21 01:49:42.804048 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.804019 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ndnpw" event={"ID":"a45eec20-9193-41a2-a65b-20a0623f23d5","Type":"ContainerStarted","Data":"4bffb8716a3f2a4378ee6580a6d21602e5a93b034c7fa3ef531b0a72c0d5e338"}
Apr 21 01:49:42.806888 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.806869 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log"
Apr 21 01:49:42.807332 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.807310 2579 generic.go:358] "Generic (PLEG): container finished" podID="4f79c179-ef2b-42fc-ae2f-cbf567b17a05" containerID="65b6ed9f963740cc4597a5984d225898c77ea2eb3d838c0884de741505d3da39" exitCode=1
Apr 21 01:49:42.807431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.807372 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"475755fa006c2b87880f051859340bdbb6d25dd0f2dedc4026d9ec559ab238f1"}
Apr 21 01:49:42.807431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.807394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"37fe291c0f5c2c9de9f3661ccead2f9dadccdb55b9a91d65a40c27f0166cfa75"}
Apr 21 01:49:42.807431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.807403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"76887253cd537fe846a1127e1e99df24100d7c2abad8d84bca13d5694dcac785"}
Apr 21 01:49:42.807431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.807411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"ac53d6819e361a150b915390cd2032b316feb61a32df303ae404dc6f1a167211"}
Apr 21 01:49:42.807431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.807420 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerDied","Data":"65b6ed9f963740cc4597a5984d225898c77ea2eb3d838c0884de741505d3da39"}
Apr 21 01:49:42.807431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.807430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"f3ff1467aac3d24eaae6c80a755f6736938877771924ba32da5313b304a30844"}
Apr 21 01:49:42.808682 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.808658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" event={"ID":"62c83f40-5302-4d3d-a41f-36cd9302f7f0","Type":"ContainerStarted","Data":"30f55a29ad4a9a25151865b1384e3c12f2f20c31d6f2eee74c10d6af8a983fc5"}
Apr 21 01:49:42.810166 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.810145 2579 generic.go:358] "Generic (PLEG): container finished" podID="16bf952c-6936-44eb-a262-16dac395d351" containerID="ec698247584425a71a05244c6b44f5903c01b5e938d523c483552bcb066947d4" exitCode=0
Apr 21 01:49:42.810263 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.810172 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerDied","Data":"ec698247584425a71a05244c6b44f5903c01b5e938d523c483552bcb066947d4"}
Apr 21 01:49:42.813805 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.813774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jvwm9" event={"ID":"e0901c2f-3720-4876-9776-5d298a25b8a3","Type":"ContainerStarted","Data":"924cd55eb9abf1ba1a20780b7ad03c174cf91c526df0aacdefcfd1b5e10e3091"}
Apr 21 01:49:42.815543 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.815518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cxcst" event={"ID":"18adaff4-c987-4e36-8131-4023c56e79c0","Type":"ContainerStarted","Data":"d53b39060492ae1d91436bb88e1c7a0484bf71cda39d2bb17a21a45f904fdda2"}
Apr 21 01:49:42.815795 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.815761 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-42.ec2.internal" podStartSLOduration=19.815749979 podStartE2EDuration="19.815749979s" podCreationTimestamp="2026-04-21 01:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:49:42.815607807 +0000 UTC m=+21.764699102" watchObservedRunningTime="2026-04-21 01:49:42.815749979 +0000 UTC m=+21.764841273"
Apr 21 01:49:42.819380 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.819353 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" event={"ID":"f43369dc-2b67-4f61-bd42-5f76536f5501","Type":"ContainerStarted","Data":"210ce164d373665c4b045580c3a622e704c9c49b646772d2f9eb536653289277"}
Apr 21 01:49:42.820768 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.820749 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wv5c7" event={"ID":"69b70ba3-d144-4882-86ae-cb9a2e4391a4","Type":"ContainerStarted","Data":"9e0fcb514fb98c662b723e90432635e9caca13cabd761867688573d6aaff1aa7"}
Apr 21 01:49:42.844987 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.844923 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jvwm9" podStartSLOduration=4.836566346 podStartE2EDuration="21.844911004s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.402760654 +0000 UTC m=+3.351851928" lastFinishedPulling="2026-04-21 01:49:41.411105313 +0000 UTC m=+20.360196586" observedRunningTime="2026-04-21 01:49:42.844303676 +0000 UTC m=+21.793394971" watchObservedRunningTime="2026-04-21 01:49:42.844911004 +0000 UTC m=+21.794002299"
Apr 21 01:49:42.856955 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.856915 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cxcst" podStartSLOduration=3.51108916 podStartE2EDuration="20.856903399s" podCreationTimestamp="2026-04-21 01:49:22 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.433117099 +0000 UTC m=+3.382208375" lastFinishedPulling="2026-04-21 01:49:41.778931328 +0000 UTC m=+20.728022614" observedRunningTime="2026-04-21 01:49:42.856708268 +0000 UTC m=+21.805799565" watchObservedRunningTime="2026-04-21 01:49:42.856903399 +0000 UTC m=+21.805994694"
Apr 21 01:49:42.868120 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.868078 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ndnpw" podStartSLOduration=3.566688344 podStartE2EDuration="20.868064591s" podCreationTimestamp="2026-04-21 01:49:22 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.438068996 +0000 UTC m=+3.387160273" lastFinishedPulling="2026-04-21 01:49:41.739445238 +0000 UTC m=+20.688536520" observedRunningTime="2026-04-21 01:49:42.867996156 +0000 UTC m=+21.817087451" watchObservedRunningTime="2026-04-21 01:49:42.868064591 +0000 UTC m=+21.817155887"
Apr 21 01:49:42.880567 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.880531 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v7wv4" podStartSLOduration=3.548187095 podStartE2EDuration="20.880519411s" podCreationTimestamp="2026-04-21 01:49:22 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.40713244 +0000 UTC m=+3.356223724" lastFinishedPulling="2026-04-21 01:49:41.739464756 +0000 UTC m=+20.688556040" observedRunningTime="2026-04-21 01:49:42.880313567 +0000 UTC m=+21.829404862" watchObservedRunningTime="2026-04-21 01:49:42.880519411 +0000 UTC m=+21.829610705"
Apr 21 01:49:42.891367 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:42.891322 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wv5c7" podStartSLOduration=9.250308484 podStartE2EDuration="21.891309932s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.412470013 +0000 UTC m=+3.361561289" lastFinishedPulling="2026-04-21 01:49:37.053471464 +0000 UTC m=+16.002562737" observedRunningTime="2026-04-21 01:49:42.8910549 +0000 UTC m=+21.840146196" watchObservedRunningTime="2026-04-21 01:49:42.891309932 +0000 UTC m=+21.840401227"
Apr 21 01:49:43.259988 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.259945 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 01:49:43.617469 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.617183 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T01:49:43.259965551Z","UUID":"f68b8e6f-dd1d-4415-b65e-6d149e870ea2","Handler":null,"Name":"","Endpoint":""}
Apr 21 01:49:43.620075 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.620051 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 01:49:43.620213 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.620082 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 01:49:43.707010 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.705714 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:43.707010 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:43.705838 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:43.707010 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.705721 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:43.707010 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:43.706288 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:43.826862 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.826816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2h8jz" event={"ID":"286bb4a3-718c-4b23-ae28-f66678e8128c","Type":"ContainerStarted","Data":"7ee44aecac6f1ec8752f1e6da199425977e2ad47683d8d8c4693f5a94ec7b2c4"}
Apr 21 01:49:43.830740 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.830692 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" event={"ID":"62c83f40-5302-4d3d-a41f-36cd9302f7f0","Type":"ContainerStarted","Data":"1089799849be2281d307d5383e96b3272127439b622916eb773ab05936821e54"}
Apr 21 01:49:43.850872 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:43.850810 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2h8jz" podStartSLOduration=4.8519523719999995 podStartE2EDuration="21.850792245s" podCreationTimestamp="2026-04-21 01:49:22 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.412295302 +0000 UTC m=+3.361386575" lastFinishedPulling="2026-04-21 01:49:41.41113516 +0000 UTC m=+20.360226448" observedRunningTime="2026-04-21 01:49:43.849964648 +0000 UTC m=+22.799055944" watchObservedRunningTime="2026-04-21 01:49:43.850792245 +0000 UTC m=+22.799883541"
Apr 21 01:49:44.835807 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:44.835777 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log"
Apr 21 01:49:44.836296 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:44.836269 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"3de5daec50e804bb18805decfa385828694ee0985303113a4bc05b8358fe31f2"}
Apr 21 01:49:44.838410 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:44.838381 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" event={"ID":"62c83f40-5302-4d3d-a41f-36cd9302f7f0","Type":"ContainerStarted","Data":"d6c99b7d69605ba1def88586bb88556adccb452753772daceec2233f8f14a5e3"}
Apr 21 01:49:44.854341 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:44.854288 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvfrr" podStartSLOduration=4.100533695 podStartE2EDuration="23.854274333s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.437994762 +0000 UTC m=+3.387086040" lastFinishedPulling="2026-04-21 01:49:44.191735393 +0000 UTC m=+23.140826678" observedRunningTime="2026-04-21 01:49:44.853876402 +0000 UTC m=+23.802967698" watchObservedRunningTime="2026-04-21 01:49:44.854274333 +0000 UTC m=+23.803365628"
Apr 21 01:49:45.705442 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:45.705412 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:45.705656 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:45.705422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:45.705656 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:45.705517 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:45.705656 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:45.705594 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:47.028417 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.028178 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:47.029820 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.029631 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:47.705592 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.705558 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:47.705592 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.705603 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:47.705799 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:47.705689 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:47.705834 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:47.705803 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:47.845229 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.845205 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log"
Apr 21 01:49:47.845576 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.845539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"b0d8f2e5edd3e07c8238650559b646e50f87ba2868a185b792569343e7cb0bce"}
Apr 21 01:49:47.846075 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.846056 2579 scope.go:117] "RemoveContainer" containerID="65b6ed9f963740cc4597a5984d225898c77ea2eb3d838c0884de741505d3da39"
Apr 21 01:49:47.847339 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.846629 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:47.847339 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.846656 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:47.847339 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.846666 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:47.847498 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.847416 2579 generic.go:358] "Generic (PLEG): container finished" podID="16bf952c-6936-44eb-a262-16dac395d351" containerID="5115bc6224907d7edab74290e721935041a908f456acb6c640b0d2af473815a6" exitCode=0
Apr 21 01:49:47.847554 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.847508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerDied","Data":"5115bc6224907d7edab74290e721935041a908f456acb6c640b0d2af473815a6"}
Apr 21 01:49:47.862341 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.862322 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:47.862432 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:47.862395 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd"
Apr 21 01:49:48.840217 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.840137 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lwrh2"]
Apr 21 01:49:48.840725 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.840288 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:48.840725 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:48.840415 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:48.842795 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.842773 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tn9c4"]
Apr 21 01:49:48.842918 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.842905 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:48.843043 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:48.843021 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:48.853751 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.853732 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log"
Apr 21 01:49:48.854119 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.854097 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" event={"ID":"4f79c179-ef2b-42fc-ae2f-cbf567b17a05","Type":"ContainerStarted","Data":"fe771d7bdf43a9dd2067502f900ac71099f0860970ac047bb001f25eb50a7c46"}
Apr 21 01:49:48.858258 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.858210 2579 generic.go:358] "Generic (PLEG): container finished" podID="16bf952c-6936-44eb-a262-16dac395d351" containerID="7d9cae73bebcc7b364f99062c81c5c4b83ff4a8df3d69b8c5df38d9af691b4b2" exitCode=0
Apr 21 01:49:48.858370 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.858263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerDied","Data":"7d9cae73bebcc7b364f99062c81c5c4b83ff4a8df3d69b8c5df38d9af691b4b2"}
Apr 21 01:49:48.879562 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:48.879508 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" podStartSLOduration=10.473472415 podStartE2EDuration="27.879488051s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.437992745 +0000 UTC m=+3.387084030" lastFinishedPulling="2026-04-21 01:49:41.844008379 +0000 UTC m=+20.793099666" observedRunningTime="2026-04-21 01:49:48.878028153 +0000 UTC m=+27.827119462" watchObservedRunningTime="2026-04-21 01:49:48.879488051 +0000 UTC m=+27.828579346"
Apr 21 01:49:49.862441 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:49.862132 2579 generic.go:358] "Generic (PLEG): container finished" podID="16bf952c-6936-44eb-a262-16dac395d351" containerID="6b623fb90855f320237818f01df65a1d67e75e2c1e45f8e9a2334f1a1e3cfb9d" exitCode=0
Apr 21 01:49:49.862784 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:49.862180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerDied","Data":"6b623fb90855f320237818f01df65a1d67e75e2c1e45f8e9a2334f1a1e3cfb9d"}
Apr 21 01:49:50.705729 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:50.705652 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:50.705729 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:50.705687 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:50.705940 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:50.705784 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:50.706021 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:50.705934 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:52.705510 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:52.705475 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:52.706292 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:52.705475 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:52.706292 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:52.705599 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:52.706292 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:52.705724 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:52.737534 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:52.737501 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:52.737694 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:52.737655 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 01:49:52.738145 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:52.738125 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jvwm9"
Apr 21 01:49:54.705295 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.705248 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:54.705721 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.705255 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:54.705721 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:54.705363 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tn9c4" podUID="a1f54627-ed53-4b78-85cb-f07dd7bc3869"
Apr 21 01:49:54.705721 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:54.705490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:49:54.850566 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.850493 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-42.ec2.internal" event="NodeReady"
Apr 21 01:49:54.850729 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.850635 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 01:49:54.881664 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.881629 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"]
Apr 21 01:49:54.887070 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.887032 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-766c6fff44-lht95"]
Apr 21 01:49:54.887225 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.887176 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"
Apr 21 01:49:54.889941 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.889916 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 21 01:49:54.889941 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.889940 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"]
Apr 21 01:49:54.890128 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.889967 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 01:49:54.890178 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.890133 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 01:49:54.890347 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.890319 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 01:49:54.890420 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.890355 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-khg59\""
Apr 21 01:49:54.893866 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.893846 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"]
Apr 21 01:49:54.894251 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.894232 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:54.896954 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.896907 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"]
Apr 21 01:49:54.897113 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.897032 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:54.897361 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.897341 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:54.900917 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.900899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hdqjq\""
Apr 21 01:49:54.901077 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.901058 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 01:49:54.901193 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.901178 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 01:49:54.901375 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.901356 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 01:49:54.902483 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.901841 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"]
Apr 21 01:49:54.902483 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.902020 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 21 01:49:54.902483 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.902150 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 21 01:49:54.902483 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.902447 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 21 01:49:54.904301 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.902723 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 21 01:49:54.904301 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.902881 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 21 01:49:54.904301 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.903031 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"]
Apr 21 01:49:54.904301 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.903629 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-766c6fff44-lht95"]
Apr 21 01:49:54.904301 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.903771 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r2p22"]
Apr 21 01:49:54.906599 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.906582 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 01:49:54.909095 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.909065 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:49:54.912123 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.911274 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 01:49:54.912123 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.911466 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 01:49:54.912123 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.911595 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbqdn\""
Apr 21 01:49:54.912123 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.911676 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 01:49:54.921748 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.921726 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r2p22"]
Apr 21 01:49:54.999494 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:54.999458 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hwqnv"]
Apr 21 01:49:55.003174 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.003144 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:55.004463 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004440 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-ca-trust-extracted\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.004609 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-klusterlet-config\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:55.004609 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-ca\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.004609 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prbkz\" (UniqueName: \"kubernetes.io/projected/119fa989-6388-4b2a-bbb2-a93810ea36d8-kube-api-access-prbkz\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.004791 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-hub\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.004791 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5b86\" (UniqueName: \"kubernetes.io/projected/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-kube-api-access-z5b86\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:49:55.004791 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:49:55.004791 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004740 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-installation-pull-secrets\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.004791 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1b23055c-1b97-4d29-9da2-753ff124fcee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bc48f48b9-kh82z\" (UID: \"1b23055c-1b97-4d29-9da2-753ff124fcee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"
Apr 21 01:49:55.005036 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004819 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-certificates\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.005036 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004845 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.005036 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004875 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-bound-sa-token\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.005036 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004904 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-image-registry-private-configuration\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.005036 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004928 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/119fa989-6388-4b2a-bbb2-a93810ea36d8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.005036 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.004957 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.005359 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-trusted-ca\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.005359 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrz4\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-kube-api-access-gjrz4\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.005359 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005177 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-tmp\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:55.005359 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnwjb\" (UniqueName: \"kubernetes.io/projected/1b23055c-1b97-4d29-9da2-753ff124fcee-kube-api-access-xnwjb\") pod \"managed-serviceaccount-addon-agent-7bc48f48b9-kh82z\" (UID: \"1b23055c-1b97-4d29-9da2-753ff124fcee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"
Apr 21 01:49:55.005359 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005231 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvprf\" (UniqueName: \"kubernetes.io/projected/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-kube-api-access-zvprf\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:55.005359 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005254 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.005794 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005696 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vd9s5\""
Apr 21 01:49:55.005892 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.005876 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 01:49:55.006169 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.006152 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 01:49:55.009789 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.009769 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwqnv"]
Apr 21 01:49:55.106279 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:49:55.106279 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74bh2\" (UniqueName: \"kubernetes.io/projected/1bdc049e-4435-4a8a-a927-c21e9eb190a6-kube-api-access-74bh2\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-installation-pull-secrets\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1b23055c-1b97-4d29-9da2-753ff124fcee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bc48f48b9-kh82z\" (UID: \"1b23055c-1b97-4d29-9da2-753ff124fcee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106345 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-certificates\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.106352 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-bound-sa-token\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.106418 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. No retries permitted until 2026-04-21 01:49:55.606396255 +0000 UTC m=+34.555487541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106458 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-image-registry-private-configuration\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106486 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/119fa989-6388-4b2a-bbb2-a93810ea36d8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-trusted-ca\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrz4\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-kube-api-access-gjrz4\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-tmp\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnwjb\" (UniqueName: \"kubernetes.io/projected/1b23055c-1b97-4d29-9da2-753ff124fcee-kube-api-access-xnwjb\") pod \"managed-serviceaccount-addon-agent-7bc48f48b9-kh82z\" (UID: \"1b23055c-1b97-4d29-9da2-753ff124fcee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvprf\" (UniqueName: \"kubernetes.io/projected/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-kube-api-access-zvprf\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106701 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-ca-trust-extracted\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106740 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-klusterlet-config\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106762 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-ca\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106784 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prbkz\" (UniqueName: \"kubernetes.io/projected/119fa989-6388-4b2a-bbb2-a93810ea36d8-kube-api-access-prbkz\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.106826 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106811 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-hub\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.107365 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106836 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5b86\" (UniqueName: \"kubernetes.io/projected/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-kube-api-access-z5b86\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:49:55.107365 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bdc049e-4435-4a8a-a927-c21e9eb190a6-config-volume\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:55.107365 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106874 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bdc049e-4435-4a8a-a927-c21e9eb190a6-tmp-dir\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:55.107365 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.106893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:55.107568 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.107482 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-tmp\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:49:55.107568 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.107488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-ca-trust-extracted\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.107697 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.107678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-certificates\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.108333 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.108025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-trusted-ca\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.108880 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.108826 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 01:49:55.109125 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.109110 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found
Apr 21 01:49:55.109355 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.109321 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:49:55.609303369 +0000 UTC m=+34.558394646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls") pod "image-registry-766c6fff44-lht95" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb") : secret "image-registry-tls" not found
Apr 21 01:49:55.109492 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.109029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/119fa989-6388-4b2a-bbb2-a93810ea36d8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.111707 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.111685 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-ca\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:49:55.111799 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.111717 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-image-registry-private-configuration\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:55.111949 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.111923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-klusterlet-config\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" Apr 21 01:49:55.112180 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.112157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-installation-pull-secrets\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:49:55.112271 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.112249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-hub\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.113636 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1b23055c-1b97-4d29-9da2-753ff124fcee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bc48f48b9-kh82z\" (UID: \"1b23055c-1b97-4d29-9da2-753ff124fcee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.113663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.113946 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/119fa989-6388-4b2a-bbb2-a93810ea36d8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.116600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-bound-sa-token\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.117018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prbkz\" (UniqueName: \"kubernetes.io/projected/119fa989-6388-4b2a-bbb2-a93810ea36d8-kube-api-access-prbkz\") pod \"cluster-proxy-proxy-agent-7b564b976c-jrvh2\" (UID: \"119fa989-6388-4b2a-bbb2-a93810ea36d8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.117162 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvprf\" (UniqueName: \"kubernetes.io/projected/6b2b2252-03b2-40ad-97d4-ed7404fbbf07-kube-api-access-zvprf\") pod \"klusterlet-addon-workmgr-684679d89d-ctrk5\" (UID: \"6b2b2252-03b2-40ad-97d4-ed7404fbbf07\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.117209 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrz4\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-kube-api-access-gjrz4\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.118108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnwjb\" (UniqueName: \"kubernetes.io/projected/1b23055c-1b97-4d29-9da2-753ff124fcee-kube-api-access-xnwjb\") pod \"managed-serviceaccount-addon-agent-7bc48f48b9-kh82z\" (UID: \"1b23055c-1b97-4d29-9da2-753ff124fcee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" Apr 21 01:49:55.119057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.118220 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5b86\" (UniqueName: \"kubernetes.io/projected/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-kube-api-access-z5b86\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22" Apr 21 01:49:55.207474 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.207442 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" Apr 21 01:49:55.207650 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.207526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:49:55.207650 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.207574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74bh2\" (UniqueName: \"kubernetes.io/projected/1bdc049e-4435-4a8a-a927-c21e9eb190a6-kube-api-access-74bh2\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:49:55.207763 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.207660 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:49:55.207763 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.207672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bdc049e-4435-4a8a-a927-c21e9eb190a6-config-volume\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:49:55.207763 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.207699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bdc049e-4435-4a8a-a927-c21e9eb190a6-tmp-dir\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:49:55.207763 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.207750 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:55.707733164 +0000 UTC m=+34.656824441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found Apr 21 01:49:55.208136 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.208085 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1bdc049e-4435-4a8a-a927-c21e9eb190a6-tmp-dir\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:49:55.208365 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.208349 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bdc049e-4435-4a8a-a927-c21e9eb190a6-config-volume\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:49:55.216315 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.216289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74bh2\" (UniqueName: \"kubernetes.io/projected/1bdc049e-4435-4a8a-a927-c21e9eb190a6-kube-api-access-74bh2\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:49:55.225182 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.225159 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" Apr 21 01:49:55.234892 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.234871 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" Apr 21 01:49:55.308966 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.308915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:49:55.309165 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.309069 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:55.309165 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.309141 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:27.309120318 +0000 UTC m=+66.258211591 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:49:55.410487 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.410389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4" Apr 21 01:49:55.410634 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.410541 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:49:55.410634 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.410559 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:49:55.410634 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.410569 2579 projected.go:194] Error preparing data for projected volume kube-api-access-8nvvh for pod openshift-network-diagnostics/network-check-target-tn9c4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:55.410634 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.410627 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh podName:a1f54627-ed53-4b78-85cb-f07dd7bc3869 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:27.410613452 +0000 UTC m=+66.359704727 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8nvvh" (UniqueName: "kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh") pod "network-check-target-tn9c4" (UID: "a1f54627-ed53-4b78-85cb-f07dd7bc3869") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:49:55.611768 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.611540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22" Apr 21 01:49:55.611768 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.611812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:49:55.611768 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.611819 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:49:55.613111 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.611881 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. No retries permitted until 2026-04-21 01:49:56.611865422 +0000 UTC m=+35.560956695 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found Apr 21 01:49:55.613111 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.611932 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:49:55.613111 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.611945 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found Apr 21 01:49:55.613111 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.612017 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:49:56.612001674 +0000 UTC m=+35.561092953 (durationBeforeRetry 1s). 
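Two distinct failure modes appear above. `secret "..." not found` (canary-serving-cert, image-registry-tls, dns-default-metrics-tls) means the object genuinely does not exist in the API yet. `object "ns"/"name" not registered` (metrics-daemon-secret, kube-root-ca.crt, openshift-service-ca.crt) comes from the kubelet's watch-based Secret/ConfigMap manager: a pod's referenced objects must be registered with the kubelet's local cache before Get calls are served, and registration had not completed for these pods. The "Caches populated" reflector lines that appear shortly below mark the point at which those lookups can start returning real results. A toy model of that register-then-get contract, purely illustrative and not the kubelet's actual code:

// Toy sketch of a watch-based object cache: Get refuses keys that were
// never registered, mirroring the "not registered" errors above.
package main

import (
	"errors"
	"fmt"
	"sync"
)

type objectCache struct {
	mu    sync.Mutex
	known map[string]string // key -> cached value ("" until a reflector fills it)
}

func (c *objectCache) Register(key string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.known[key] = "" // start watching; the value arrives asynchronously
}

func (c *objectCache) Get(key string) (string, error) {
	c.mu.Lock()
	defer c.mu.Unlock()
	v, ok := c.known[key]
	if !ok {
		return "", fmt.Errorf("object %q not registered", key)
	}
	if v == "" {
		return "", errors.New("cache not populated yet")
	}
	return v, nil
}

func main() {
	c := &objectCache{known: map[string]string{}}
	_, err := c.Get("openshift-multus/metrics-daemon-secret")
	fmt.Println(err) // object "openshift-multus/metrics-daemon-secret" not registered
	c.Register("openshift-multus/metrics-daemon-secret")
	_, err = c.Get("openshift-multus/metrics-daemon-secret")
	fmt.Println(err) // cache not populated yet
}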
Apr 21 01:49:55.712888 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.712469 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:55.712888 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.712620 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 01:49:55.712888 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:55.712679 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:56.712660915 +0000 UTC m=+35.661752193 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found
Apr 21 01:49:55.744850 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.744818 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z"]
Apr 21 01:49:55.747680 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.747535 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"]
Apr 21 01:49:55.756154 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.756130 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"]
Apr 21 01:49:55.833111 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:55.833074 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b23055c_1b97_4d29_9da2_753ff124fcee.slice/crio-a956c917d3674c1c39ba61b8090bc070d8b00bfcfe2988891341367aef8eaf7d WatchSource:0}: Error finding container a956c917d3674c1c39ba61b8090bc070d8b00bfcfe2988891341367aef8eaf7d: Status 404 returned error can't find the container with id a956c917d3674c1c39ba61b8090bc070d8b00bfcfe2988891341367aef8eaf7d
Apr 21 01:49:55.833596 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:55.833575 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b2b2252_03b2_40ad_97d4_ed7404fbbf07.slice/crio-140fd0527af779a899f77061bb68e5ecd325c325ffc97cf5b4c83984a40048c6 WatchSource:0}: Error finding container 140fd0527af779a899f77061bb68e5ecd325c325ffc97cf5b4c83984a40048c6: Status 404 returned error can't find the container with id 140fd0527af779a899f77061bb68e5ecd325c325ffc97cf5b4c83984a40048c6
Apr 21 01:49:55.834163 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:49:55.834141 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod119fa989_6388_4b2a_bbb2_a93810ea36d8.slice/crio-dc5a9f1c55b9d85c5f4afb93a6bf4cea9fee42fac3c4c11d3f944dfc4000f9f1 WatchSource:0}: Error finding container dc5a9f1c55b9d85c5f4afb93a6bf4cea9fee42fac3c4c11d3f944dfc4000f9f1: Status 404 returned error can't find the container with id dc5a9f1c55b9d85c5f4afb93a6bf4cea9fee42fac3c4c11d3f944dfc4000f9f1
Apr 21 01:49:55.873749 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.873726 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" event={"ID":"1b23055c-1b97-4d29-9da2-753ff124fcee","Type":"ContainerStarted","Data":"a956c917d3674c1c39ba61b8090bc070d8b00bfcfe2988891341367aef8eaf7d"}
Apr 21 01:49:55.874701 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.874676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" event={"ID":"6b2b2252-03b2-40ad-97d4-ed7404fbbf07","Type":"ContainerStarted","Data":"140fd0527af779a899f77061bb68e5ecd325c325ffc97cf5b4c83984a40048c6"}
Apr 21 01:49:55.875513 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:55.875491 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" event={"ID":"119fa989-6388-4b2a-bbb2-a93810ea36d8","Type":"ContainerStarted","Data":"dc5a9f1c55b9d85c5f4afb93a6bf4cea9fee42fac3c4c11d3f944dfc4000f9f1"}
Apr 21 01:49:56.619956 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.619914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:49:56.625832 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:56.625801 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 01:49:56.626016 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:56.625896 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. No retries permitted until 2026-04-21 01:49:58.625867914 +0000 UTC m=+37.574959202 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found
Apr 21 01:49:56.626323 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.620431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:56.626618 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:56.626600 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 01:49:56.626668 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:56.626628 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found
Apr 21 01:49:56.626706 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:56.626677 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:49:58.626661882 +0000 UTC m=+37.575753168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls") pod "image-registry-766c6fff44-lht95" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb") : secret "image-registry-tls" not found
Apr 21 01:49:56.705930 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.705892 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:49:56.706795 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.705892 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:49:56.710173 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.710135 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 01:49:56.710396 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.710377 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68xr4\""
Apr 21 01:49:56.710621 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.710606 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb98s\""
Apr 21 01:49:56.711207 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.710779 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 01:49:56.711207 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.711050 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 01:49:56.726867 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.726847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:56.727229 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:56.727029 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 01:49:56.727229 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:56.727096 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:49:58.727075928 +0000 UTC m=+37.676167216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found
Apr 21 01:49:56.885753 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.884684 2579 generic.go:358] "Generic (PLEG): container finished" podID="16bf952c-6936-44eb-a262-16dac395d351" containerID="d9a068545a9a30900265ac753855f3146f204e71e6d99b1411df2617fd6531e4" exitCode=0
Apr 21 01:49:56.885753 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:56.884745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerDied","Data":"d9a068545a9a30900265ac753855f3146f204e71e6d99b1411df2617fd6531e4"}
Apr 21 01:49:57.894330 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:57.894104 2579 generic.go:358] "Generic (PLEG): container finished" podID="16bf952c-6936-44eb-a262-16dac395d351" containerID="7e60867b79b42d93fcf5c4e2eec5f8c8c918cd44dfd635bc93ddec2bf8b67d46" exitCode=0
Apr 21 01:49:57.894762 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:57.894382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerDied","Data":"7e60867b79b42d93fcf5c4e2eec5f8c8c918cd44dfd635bc93ddec2bf8b67d46"}
Apr 21 01:49:58.642744 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:58.642651 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:49:58.642744 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:58.642732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:49:58.642961 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:58.642885 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 01:49:58.642961 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:58.642899 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found
Apr 21 01:49:58.642961 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:58.642956 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:50:02.642936746 +0000 UTC m=+41.592028022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls") pod "image-registry-766c6fff44-lht95" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb") : secret "image-registry-tls" not found
Apr 21 01:49:58.643397 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:58.643376 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 01:49:58.643471 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:58.643425 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. No retries permitted until 2026-04-21 01:50:02.643409532 +0000 UTC m=+41.592500809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found
Apr 21 01:49:58.743302 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:49:58.743264 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:49:58.743481 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:58.743424 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 01:49:58.743537 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:49:58.743489 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:02.743472738 +0000 UTC m=+41.692564037 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found
Apr 21 01:50:02.676822 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.676783 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:50:02.677292 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.676869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:50:02.677292 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:02.676939 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 01:50:02.677292 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:02.676961 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found
Apr 21 01:50:02.677292 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:02.676999 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 01:50:02.677292 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:02.677033 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:50:10.677018362 +0000 UTC m=+49.626109639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls") pod "image-registry-766c6fff44-lht95" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb") : secret "image-registry-tls" not found
Apr 21 01:50:02.677292 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:02.677046 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. No retries permitted until 2026-04-21 01:50:10.677040657 +0000 UTC m=+49.626131930 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found
Apr 21 01:50:02.778014 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.777957 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:50:02.778121 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:02.778102 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 01:50:02.778173 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:02.778164 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:10.778149562 +0000 UTC m=+49.727240835 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found
Apr 21 01:50:02.908501 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.908471 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" event={"ID":"16bf952c-6936-44eb-a262-16dac395d351","Type":"ContainerStarted","Data":"fa1b2a885a7c7cdd84e876fc3b546a37df5fa740cfea84679073b4682110ff98"}
Apr 21 01:50:02.909783 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.909758 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" event={"ID":"1b23055c-1b97-4d29-9da2-753ff124fcee","Type":"ContainerStarted","Data":"f524b605cef6a8a839d8f1ff67b4b8ace555714f64d60cf603fcb316a356b053"}
Apr 21 01:50:02.910981 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.910945 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" event={"ID":"6b2b2252-03b2-40ad-97d4-ed7404fbbf07","Type":"ContainerStarted","Data":"c983fd338d3833e6457a4388a7ddb728788de289fbb0fdfe61f0cbe3e0dd6e58"}
Apr 21 01:50:02.911099 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.911084 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:50:02.912225 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.912205 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" event={"ID":"119fa989-6388-4b2a-bbb2-a93810ea36d8","Type":"ContainerStarted","Data":"2b3075cb28ce48dd4b9dc6a0f426a441c64d2b28c466826418f8e94a8c77104b"}
Apr 21 01:50:02.913055 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.913036 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5"
Apr 21 01:50:02.935644 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.935546 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" podStartSLOduration=10.501526549 podStartE2EDuration="41.935528483s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.433183199 +0000 UTC m=+3.382274485" lastFinishedPulling="2026-04-21 01:49:55.867185133 +0000 UTC m=+34.816276419" observedRunningTime="2026-04-21 01:50:02.929924358 +0000 UTC m=+41.879015652" watchObservedRunningTime="2026-04-21 01:50:02.935528483 +0000 UTC m=+41.884619759"
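Tracking any one failing volume through the nestedpendingoperations records shows the retry policy: durationBeforeRetry goes 500ms, 1s, 2s, 4s, 8s here, and continues 16s, 32s, 1m4s further below - exponential backoff with factor 2 for each consecutive failure of the same {volumeName, podName} operation. A minimal sketch of that schedule follows; the 2m2s ceiling matches the upstream kubelet's exponentialbackoff package but should be treated as an assumption for this particular build.

// Sketch of the per-operation exponential backoff visible in the
// nestedpendingoperations records: 500ms doubling per failure, capped.
package main

import (
	"fmt"
	"time"
)

const (
	initialDurationBeforeRetry = 500 * time.Millisecond
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second // assumed cap
)

func nextBackoff(current time.Duration) time.Duration {
	if current == 0 {
		return initialDurationBeforeRetry
	}
	next := 2 * current
	if next > maxDurationBeforeRetry {
		next = maxDurationBeforeRetry
	}
	return next
}

func main() {
	d := time.Duration(0)
	for i := 0; i < 9; i++ {
		d = nextBackoff(d)
		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s
	}
}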
"Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ljkm7" podStartSLOduration=10.501526549 podStartE2EDuration="41.935528483s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:24.433183199 +0000 UTC m=+3.382274485" lastFinishedPulling="2026-04-21 01:49:55.867185133 +0000 UTC m=+34.816276419" observedRunningTime="2026-04-21 01:50:02.929924358 +0000 UTC m=+41.879015652" watchObservedRunningTime="2026-04-21 01:50:02.935528483 +0000 UTC m=+41.884619759" Apr 21 01:50:02.947043 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.946993 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" podStartSLOduration=35.827036385 podStartE2EDuration="41.946958998s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:55.843488129 +0000 UTC m=+34.792579403" lastFinishedPulling="2026-04-21 01:50:01.963410739 +0000 UTC m=+40.912502016" observedRunningTime="2026-04-21 01:50:02.946935163 +0000 UTC m=+41.896026456" watchObservedRunningTime="2026-04-21 01:50:02.946958998 +0000 UTC m=+41.896050294" Apr 21 01:50:02.966357 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:02.966316 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" podStartSLOduration=35.862174871 podStartE2EDuration="41.966305227s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:55.843662584 +0000 UTC m=+34.792753858" lastFinishedPulling="2026-04-21 01:50:01.947792936 +0000 UTC m=+40.896884214" observedRunningTime="2026-04-21 01:50:02.965898763 +0000 UTC m=+41.914990058" watchObservedRunningTime="2026-04-21 01:50:02.966305227 +0000 UTC m=+41.915396521" Apr 21 01:50:03.378590 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.378556 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9rpdw"] Apr 21 01:50:03.399431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.399406 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9rpdw"] Apr 21 01:50:03.399612 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.399513 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.402618 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.402596 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 01:50:03.485401 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.485372 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ccbf8a05-33ed-4417-b280-79fda195d53d-kubelet-config\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.485540 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.485406 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ccbf8a05-33ed-4417-b280-79fda195d53d-original-pull-secret\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.485540 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.485509 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ccbf8a05-33ed-4417-b280-79fda195d53d-dbus\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.586726 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.586697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ccbf8a05-33ed-4417-b280-79fda195d53d-kubelet-config\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.586726 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.586730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ccbf8a05-33ed-4417-b280-79fda195d53d-original-pull-secret\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.586918 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.586799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ccbf8a05-33ed-4417-b280-79fda195d53d-dbus\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.586918 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.586834 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ccbf8a05-33ed-4417-b280-79fda195d53d-kubelet-config\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.587098 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.587081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ccbf8a05-33ed-4417-b280-79fda195d53d-dbus\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " 
pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.590954 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.590926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ccbf8a05-33ed-4417-b280-79fda195d53d-original-pull-secret\") pod \"global-pull-secret-syncer-9rpdw\" (UID: \"ccbf8a05-33ed-4417-b280-79fda195d53d\") " pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.707889 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.707817 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9rpdw" Apr 21 01:50:03.821257 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.821223 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9rpdw"] Apr 21 01:50:03.823857 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:50:03.823828 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccbf8a05_33ed_4417_b280_79fda195d53d.slice/crio-a56fb8c9564f28f57aa7ba802af216a00e5e3bed6c2f4c33244f12735958f396 WatchSource:0}: Error finding container a56fb8c9564f28f57aa7ba802af216a00e5e3bed6c2f4c33244f12735958f396: Status 404 returned error can't find the container with id a56fb8c9564f28f57aa7ba802af216a00e5e3bed6c2f4c33244f12735958f396 Apr 21 01:50:03.914847 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:03.914805 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9rpdw" event={"ID":"ccbf8a05-33ed-4417-b280-79fda195d53d","Type":"ContainerStarted","Data":"a56fb8c9564f28f57aa7ba802af216a00e5e3bed6c2f4c33244f12735958f396"} Apr 21 01:50:05.921229 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:05.921200 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" event={"ID":"119fa989-6388-4b2a-bbb2-a93810ea36d8","Type":"ContainerStarted","Data":"ae57e84c35116137214d0599672f2098051ea369c6bd0738ed26bc7e5e55ddba"} Apr 21 01:50:05.921229 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:05.921236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" event={"ID":"119fa989-6388-4b2a-bbb2-a93810ea36d8","Type":"ContainerStarted","Data":"d372a439dc476cd7e36854d6f5383df16212e2083692924494717d56bffd30c0"} Apr 21 01:50:05.942297 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:05.942237 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" podStartSLOduration=35.000741288 podStartE2EDuration="44.942217093s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:49:55.843409212 +0000 UTC m=+34.792500485" lastFinishedPulling="2026-04-21 01:50:05.784885004 +0000 UTC m=+44.733976290" observedRunningTime="2026-04-21 01:50:05.94085992 +0000 UTC m=+44.889951216" watchObservedRunningTime="2026-04-21 01:50:05.942217093 +0000 UTC m=+44.891308389" Apr 21 01:50:09.934004 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:09.933954 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9rpdw" event={"ID":"ccbf8a05-33ed-4417-b280-79fda195d53d","Type":"ContainerStarted","Data":"11db2ddc020b3560293522ac78b63b354d2d9cbd23fe1f5724bde7c61c63e4fc"} Apr 21 01:50:09.947653 ip-10-0-129-42 
kubenswrapper[2579]: I0421 01:50:09.947607 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9rpdw" podStartSLOduration=1.860514489 podStartE2EDuration="6.947587126s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:03.825666708 +0000 UTC m=+42.774757984" lastFinishedPulling="2026-04-21 01:50:08.912739346 +0000 UTC m=+47.861830621" observedRunningTime="2026-04-21 01:50:09.947317852 +0000 UTC m=+48.896409149" watchObservedRunningTime="2026-04-21 01:50:09.947587126 +0000 UTC m=+48.896678422" Apr 21 01:50:10.748284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:10.748243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22" Apr 21 01:50:10.748483 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:10.748306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:50:10.748483 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:10.748411 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:10.748597 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:10.748489 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. No retries permitted until 2026-04-21 01:50:26.748469406 +0000 UTC m=+65.697560694 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found Apr 21 01:50:10.748597 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:10.748418 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:50:10.748597 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:10.748516 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found Apr 21 01:50:10.748597 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:10.748557 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:50:26.748546062 +0000 UTC m=+65.697637337 (durationBeforeRetry 16s). 
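At this point the cert, registry-tls, and metrics-tls mounts have been failing for over a minute, the retry interval has grown to 16s (32s and 1m4s follow below), and the affected pods stay in ContainerCreating until their operators publish the missing secrets. Rather than polling, a watch surfaces the moment a missing secret is created; a hedged client-go sketch, reusing the namespace/name from the dns-default record and a hypothetical kubeconfig path:

// Watch for a single secret to appear instead of polling; the field
// selector narrows the watch to the one object the failing mount needs.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/watch"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // hypothetical path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	w, err := cs.CoreV1().Secrets("openshift-dns").Watch(context.TODO(), metav1.ListOptions{
		FieldSelector: "metadata.name=dns-default-metrics-tls",
	})
	if err != nil {
		panic(err)
	}
	defer w.Stop()
	for ev := range w.ResultChan() {
		if ev.Type == watch.Added {
			fmt.Println("secret created - kubelet's next retry should succeed")
			return
		}
	}
}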
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls") pod "image-registry-766c6fff44-lht95" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb") : secret "image-registry-tls" not found Apr 21 01:50:10.848788 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:10.848760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:50:10.848931 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:10.848893 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:10.848997 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:10.848946 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:26.848932426 +0000 UTC m=+65.798023699 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found Apr 21 01:50:19.873439 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:19.873410 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d79xd" Apr 21 01:50:26.768192 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:26.768154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22" Apr 21 01:50:26.768609 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:26.768213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:50:26.768609 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:26.768301 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:26.768609 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:26.768314 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:50:26.768609 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:26.768335 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found Apr 21 01:50:26.768609 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:26.768370 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. 
No retries permitted until 2026-04-21 01:50:58.768354681 +0000 UTC m=+97.717445954 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found Apr 21 01:50:26.768609 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:26.768386 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:50:58.768378199 +0000 UTC m=+97.717469472 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls") pod "image-registry-766c6fff44-lht95" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb") : secret "image-registry-tls" not found Apr 21 01:50:26.868744 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:26.868709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:50:26.868885 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:26.868866 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:26.868995 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:26.868936 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:50:58.868916895 +0000 UTC m=+97.818008176 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found Apr 21 01:50:27.372539 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.372500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:50:27.375299 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.375281 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 01:50:27.383347 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:27.383331 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 01:50:27.383421 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:27.383380 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:51:31.383364625 +0000 UTC m=+130.332455898 (durationBeforeRetry 1m4s). 
Apr 21 01:50:27.473305 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.473270 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:50:27.475957 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.475936 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 01:50:27.486234 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.486214 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 01:50:27.497531 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.497503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nvvh\" (UniqueName: \"kubernetes.io/projected/a1f54627-ed53-4b78-85cb-f07dd7bc3869-kube-api-access-8nvvh\") pod \"network-check-target-tn9c4\" (UID: \"a1f54627-ed53-4b78-85cb-f07dd7bc3869\") " pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:50:27.630766 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.630668 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb98s\""
Apr 21 01:50:27.637212 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.637177 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:50:27.753750 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.753719 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tn9c4"]
Apr 21 01:50:27.756832 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:50:27.756804 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1f54627_ed53_4b78_85cb_f07dd7bc3869.slice/crio-12f024bfc7bc47cd1317e48e24d127c9db2984f43965595214b45ee807ea892b WatchSource:0}: Error finding container 12f024bfc7bc47cd1317e48e24d127c9db2984f43965595214b45ee807ea892b: Status 404 returned error can't find the container with id 12f024bfc7bc47cd1317e48e24d127c9db2984f43965595214b45ee807ea892b
Apr 21 01:50:27.980444 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:27.980411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tn9c4" event={"ID":"a1f54627-ed53-4b78-85cb-f07dd7bc3869","Type":"ContainerStarted","Data":"12f024bfc7bc47cd1317e48e24d127c9db2984f43965595214b45ee807ea892b"}
Apr 21 01:50:31.991464 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:31.991432 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tn9c4" event={"ID":"a1f54627-ed53-4b78-85cb-f07dd7bc3869","Type":"ContainerStarted","Data":"79651066d5728440228c5d82eacc4f18ff338e91a8857f4a5d3f0abbfe6a4015"}
Apr 21 01:50:31.991918 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:31.991550 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:50:32.005616 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:32.005571 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tn9c4" podStartSLOduration=67.828076389 podStartE2EDuration="1m11.005557877s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:50:27.758769454 +0000 UTC m=+66.707860727" lastFinishedPulling="2026-04-21 01:50:30.936250936 +0000 UTC m=+69.885342215" observedRunningTime="2026-04-21 01:50:32.005246354 +0000 UTC m=+70.954337648" watchObservedRunningTime="2026-04-21 01:50:32.005557877 +0000 UTC m=+70.954649173"
Apr 21 01:50:58.809289 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:58.809253 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:50:58.809775 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:58.809306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:50:58.809775 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:58.809408 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 01:50:58.809775 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:58.809420 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-766c6fff44-lht95: secret "image-registry-tls" not found
Apr 21 01:50:58.809775 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:58.809424 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 01:50:58.809775 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:58.809477 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls podName:2a74ac67-9fe5-4614-91e8-e0d09c4332fb nodeName:}" failed. No retries permitted until 2026-04-21 01:52:02.809462865 +0000 UTC m=+161.758554138 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls") pod "image-registry-766c6fff44-lht95" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb") : secret "image-registry-tls" not found
Apr 21 01:50:58.809775 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:58.809500 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert podName:70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc nodeName:}" failed. No retries permitted until 2026-04-21 01:52:02.809482971 +0000 UTC m=+161.758574244 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert") pod "ingress-canary-r2p22" (UID: "70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc") : secret "canary-serving-cert" not found
Apr 21 01:50:58.909780 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:50:58.909749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:50:58.909909 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:58.909892 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 01:50:58.909965 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:50:58.909955 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls podName:1bdc049e-4435-4a8a-a927-c21e9eb190a6 nodeName:}" failed. No retries permitted until 2026-04-21 01:52:02.909937656 +0000 UTC m=+161.859028930 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls") pod "dns-default-hwqnv" (UID: "1bdc049e-4435-4a8a-a927-c21e9eb190a6") : secret "dns-default-metrics-tls" not found
Apr 21 01:51:02.996499 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:51:02.996471 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tn9c4"
Apr 21 01:51:31.443223 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:51:31.443164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2"
Apr 21 01:51:31.443716 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:51:31.443306 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 01:51:31.443716 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:51:31.443383 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs podName:202b33ea-296c-464b-8ee5-774d048859c2 nodeName:}" failed. No retries permitted until 2026-04-21 01:53:33.443365185 +0000 UTC m=+252.392456458 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs") pod "network-metrics-daemon-lwrh2" (UID: "202b33ea-296c-464b-8ee5-774d048859c2") : secret "metrics-daemon-secret" not found
Apr 21 01:51:41.376038 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:51:41.376010 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ndnpw_a45eec20-9193-41a2-a65b-20a0623f23d5/dns-node-resolver/0.log"
Apr 21 01:51:42.575312 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:51:42.575284 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wv5c7_69b70ba3-d144-4882-86ae-cb9a2e4391a4/node-ca/0.log"
Apr 21 01:51:57.916119 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:51:57.916077 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-766c6fff44-lht95" podUID="2a74ac67-9fe5-4614-91e8-e0d09c4332fb"
Apr 21 01:51:57.948458 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:51:57.948428 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-r2p22" podUID="70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc"
Apr 21 01:51:58.014891 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:51:58.014845 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hwqnv" podUID="1bdc049e-4435-4a8a-a927-c21e9eb190a6"
Apr 21 01:51:58.195176 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:51:58.195110 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hwqnv"
Apr 21 01:51:58.198527 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:51:58.195804 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2p22"
Apr 21 01:51:58.198527 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:51:58.195896 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:51:59.722806 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:51:59.722771 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lwrh2" podUID="202b33ea-296c-464b-8ee5-774d048859c2"
Apr 21 01:52:00.619927 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.619892 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j5d6m"]
Apr 21 01:52:00.623177 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.623154 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.626679 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.626659 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 01:52:00.626798 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.626694 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 01:52:00.627070 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.627051 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 01:52:00.627194 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.626967 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-q47kz\""
Apr 21 01:52:00.627267 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.627050 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 01:52:00.632495 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.632477 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j5d6m"]
Apr 21 01:52:00.750147 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.750120 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/413a0815-b025-4937-bb22-c070bfb35757-data-volume\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.750566 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.750157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/413a0815-b025-4937-bb22-c070bfb35757-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.750566 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.750179 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4nrr\" (UniqueName: \"kubernetes.io/projected/413a0815-b025-4937-bb22-c070bfb35757-kube-api-access-t4nrr\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.750566 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.750279 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/413a0815-b025-4937-bb22-c070bfb35757-crio-socket\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.750566 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.750329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/413a0815-b025-4937-bb22-c070bfb35757-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851078 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851047 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/413a0815-b025-4937-bb22-c070bfb35757-data-volume\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851255 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/413a0815-b025-4937-bb22-c070bfb35757-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851255 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4nrr\" (UniqueName: \"kubernetes.io/projected/413a0815-b025-4937-bb22-c070bfb35757-kube-api-access-t4nrr\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851255 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/413a0815-b025-4937-bb22-c070bfb35757-crio-socket\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851255 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/413a0815-b025-4937-bb22-c070bfb35757-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851255 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851223 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/413a0815-b025-4937-bb22-c070bfb35757-crio-socket\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851505 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851479 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/413a0815-b025-4937-bb22-c070bfb35757-data-volume\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.851666 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.851648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/413a0815-b025-4937-bb22-c070bfb35757-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.853602 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.853581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/413a0815-b025-4937-bb22-c070bfb35757-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.860942 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.860924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4nrr\" (UniqueName: \"kubernetes.io/projected/413a0815-b025-4937-bb22-c070bfb35757-kube-api-access-t4nrr\") pod \"insights-runtime-extractor-j5d6m\" (UID: \"413a0815-b025-4937-bb22-c070bfb35757\") " pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:00.933028 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:00.932940 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j5d6m"
Apr 21 01:52:01.049293 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:01.049263 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j5d6m"]
Apr 21 01:52:01.052540 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:52:01.052506 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod413a0815_b025_4937_bb22_c070bfb35757.slice/crio-a18119b8c42df101e3205593049636bf3f168b3680aadb6822a9e3ee8f899cd8 WatchSource:0}: Error finding container a18119b8c42df101e3205593049636bf3f168b3680aadb6822a9e3ee8f899cd8: Status 404 returned error can't find the container with id a18119b8c42df101e3205593049636bf3f168b3680aadb6822a9e3ee8f899cd8
Apr 21 01:52:01.202394 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:01.202314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5d6m" event={"ID":"413a0815-b025-4937-bb22-c070bfb35757","Type":"ContainerStarted","Data":"03b100fb3e79b84df5f58f4bffc65289c1eaee913d36759ca84178520dbaf857"}
Apr 21 01:52:01.202394 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:01.202352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5d6m" event={"ID":"413a0815-b025-4937-bb22-c070bfb35757","Type":"ContainerStarted","Data":"a18119b8c42df101e3205593049636bf3f168b3680aadb6822a9e3ee8f899cd8"}
Apr 21 01:52:02.205891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.205858 2579 generic.go:358] "Generic (PLEG): container finished" podID="1b23055c-1b97-4d29-9da2-753ff124fcee" containerID="f524b605cef6a8a839d8f1ff67b4b8ace555714f64d60cf603fcb316a356b053" exitCode=255
Apr 21 01:52:02.206354 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.205947 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" event={"ID":"1b23055c-1b97-4d29-9da2-753ff124fcee","Type":"ContainerDied","Data":"f524b605cef6a8a839d8f1ff67b4b8ace555714f64d60cf603fcb316a356b053"}
Apr 21 01:52:02.206354 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.206316 2579 scope.go:117] "RemoveContainer" containerID="f524b605cef6a8a839d8f1ff67b4b8ace555714f64d60cf603fcb316a356b053"
Apr 21 01:52:02.207885 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.207849 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5d6m" event={"ID":"413a0815-b025-4937-bb22-c070bfb35757","Type":"ContainerStarted","Data":"4f1a4970d930b1cae9a4218eaa9fc138199dd5c3258e71b91c6111f73c7b5b74"}
Apr 21 01:52:02.209370 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.209280 2579 generic.go:358] "Generic (PLEG): container finished" podID="6b2b2252-03b2-40ad-97d4-ed7404fbbf07" containerID="c983fd338d3833e6457a4388a7ddb728788de289fbb0fdfe61f0cbe3e0dd6e58" exitCode=1
Apr 21 01:52:02.209370 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.209352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" event={"ID":"6b2b2252-03b2-40ad-97d4-ed7404fbbf07","Type":"ContainerDied","Data":"c983fd338d3833e6457a4388a7ddb728788de289fbb0fdfe61f0cbe3e0dd6e58"}
Apr 21 01:52:02.209674 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.209652 2579 scope.go:117] "RemoveContainer" containerID="c983fd338d3833e6457a4388a7ddb728788de289fbb0fdfe61f0cbe3e0dd6e58"
containerID="c983fd338d3833e6457a4388a7ddb728788de289fbb0fdfe61f0cbe3e0dd6e58" Apr 21 01:52:02.868147 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.868114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:52:02.868360 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.868166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22" Apr 21 01:52:02.870906 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.870883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"image-registry-766c6fff44-lht95\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:52:02.871061 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.870943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc-cert\") pod \"ingress-canary-r2p22\" (UID: \"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc\") " pod="openshift-ingress-canary/ingress-canary-r2p22" Apr 21 01:52:02.911789 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.911753 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" Apr 21 01:52:02.969297 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.969263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:52:02.971581 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.971561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bdc049e-4435-4a8a-a927-c21e9eb190a6-metrics-tls\") pod \"dns-default-hwqnv\" (UID: \"1bdc049e-4435-4a8a-a927-c21e9eb190a6\") " pod="openshift-dns/dns-default-hwqnv" Apr 21 01:52:03.000001 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.999958 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vd9s5\"" Apr 21 01:52:03.000001 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.999967 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hdqjq\"" Apr 21 01:52:03.000167 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:02.999958 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbqdn\"" Apr 21 01:52:03.007106 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.007089 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hwqnv" Apr 21 01:52:03.007185 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.007106 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r2p22" Apr 21 01:52:03.007240 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.007222 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:52:03.192160 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.192128 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-766c6fff44-lht95"] Apr 21 01:52:03.196233 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:52:03.196202 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a74ac67_9fe5_4614_91e8_e0d09c4332fb.slice/crio-43c9c1b0c2704eb0382c5561bbb2ed01419d917190c4501c575d8e18faee0c5d WatchSource:0}: Error finding container 43c9c1b0c2704eb0382c5561bbb2ed01419d917190c4501c575d8e18faee0c5d: Status 404 returned error can't find the container with id 43c9c1b0c2704eb0382c5561bbb2ed01419d917190c4501c575d8e18faee0c5d Apr 21 01:52:03.213073 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.213043 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" event={"ID":"6b2b2252-03b2-40ad-97d4-ed7404fbbf07","Type":"ContainerStarted","Data":"6ac728b8143f4447a6c53605c2de5a7f74919aab1fca9b9f92ec2e0814ee7de3"} Apr 21 01:52:03.213612 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.213390 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" Apr 21 01:52:03.214116 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.214095 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-684679d89d-ctrk5" Apr 21 01:52:03.214599 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.214572 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-766c6fff44-lht95" event={"ID":"2a74ac67-9fe5-4614-91e8-e0d09c4332fb","Type":"ContainerStarted","Data":"43c9c1b0c2704eb0382c5561bbb2ed01419d917190c4501c575d8e18faee0c5d"} Apr 21 01:52:03.216026 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.216008 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc48f48b9-kh82z" event={"ID":"1b23055c-1b97-4d29-9da2-753ff124fcee","Type":"ContainerStarted","Data":"982766bab37d399cfc7d5aeee30a2f4a4355e2a0eb09a415f35dabca5781f8e7"} Apr 21 01:52:03.217541 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.217522 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j5d6m" event={"ID":"413a0815-b025-4937-bb22-c070bfb35757","Type":"ContainerStarted","Data":"9c999efe54d204fa9960d8d3fb257b46ebee3a24a147d17f0849d6188a1463f3"} Apr 21 01:52:03.243692 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.243650 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j5d6m" podStartSLOduration=1.361604598 podStartE2EDuration="3.243634441s" podCreationTimestamp="2026-04-21 01:52:00 +0000 UTC" firstStartedPulling="2026-04-21 
01:52:01.107458859 +0000 UTC m=+160.056550137" lastFinishedPulling="2026-04-21 01:52:02.989488688 +0000 UTC m=+161.938579980" observedRunningTime="2026-04-21 01:52:03.242816482 +0000 UTC m=+162.191907791" watchObservedRunningTime="2026-04-21 01:52:03.243634441 +0000 UTC m=+162.192725739" Apr 21 01:52:03.366933 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.366891 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hwqnv"] Apr 21 01:52:03.367933 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:03.367911 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r2p22"] Apr 21 01:52:03.371549 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:52:03.371521 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bdc049e_4435_4a8a_a927_c21e9eb190a6.slice/crio-219fcf777b7d7132d9f8614c4c1bedfa5cb314e409a373e10b98523d5daba560 WatchSource:0}: Error finding container 219fcf777b7d7132d9f8614c4c1bedfa5cb314e409a373e10b98523d5daba560: Status 404 returned error can't find the container with id 219fcf777b7d7132d9f8614c4c1bedfa5cb314e409a373e10b98523d5daba560 Apr 21 01:52:03.372265 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:52:03.372184 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70ae3e7d_6300_4ae1_b1f2_a24f5fff0bfc.slice/crio-75ded13ca76c92f99651b6712121089254ca56b2a9c12e611e82e118a7561808 WatchSource:0}: Error finding container 75ded13ca76c92f99651b6712121089254ca56b2a9c12e611e82e118a7561808: Status 404 returned error can't find the container with id 75ded13ca76c92f99651b6712121089254ca56b2a9c12e611e82e118a7561808 Apr 21 01:52:04.221667 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:04.221624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2p22" event={"ID":"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc","Type":"ContainerStarted","Data":"75ded13ca76c92f99651b6712121089254ca56b2a9c12e611e82e118a7561808"} Apr 21 01:52:04.223316 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:04.223285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwqnv" event={"ID":"1bdc049e-4435-4a8a-a927-c21e9eb190a6","Type":"ContainerStarted","Data":"219fcf777b7d7132d9f8614c4c1bedfa5cb314e409a373e10b98523d5daba560"} Apr 21 01:52:04.225565 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:04.225009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-766c6fff44-lht95" event={"ID":"2a74ac67-9fe5-4614-91e8-e0d09c4332fb","Type":"ContainerStarted","Data":"75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4"} Apr 21 01:52:05.226919 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:05.226888 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:52:06.231657 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:06.231623 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r2p22" event={"ID":"70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc","Type":"ContainerStarted","Data":"4ed49a075da0ab9d27cd6b3c55c522fb52121a9a84ebb95da6fe80387fec6fa4"} Apr 21 01:52:06.233249 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:06.233223 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwqnv" 
event={"ID":"1bdc049e-4435-4a8a-a927-c21e9eb190a6","Type":"ContainerStarted","Data":"c2c7f408bc66ff951fdb371cb28b8db0ead13fc1e198b1f2166056defb28d692"} Apr 21 01:52:06.233249 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:06.233253 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hwqnv" event={"ID":"1bdc049e-4435-4a8a-a927-c21e9eb190a6","Type":"ContainerStarted","Data":"acfa0b856792722eacd72d285e3229e0eba8ab6157e9820e8d592747e0ed190f"} Apr 21 01:52:06.233429 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:06.233373 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hwqnv" Apr 21 01:52:06.248755 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:06.248715 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r2p22" podStartSLOduration=130.362652644 podStartE2EDuration="2m12.248704216s" podCreationTimestamp="2026-04-21 01:49:54 +0000 UTC" firstStartedPulling="2026-04-21 01:52:03.374091022 +0000 UTC m=+162.323182296" lastFinishedPulling="2026-04-21 01:52:05.26014259 +0000 UTC m=+164.209233868" observedRunningTime="2026-04-21 01:52:06.248283342 +0000 UTC m=+165.197374641" watchObservedRunningTime="2026-04-21 01:52:06.248704216 +0000 UTC m=+165.197795511" Apr 21 01:52:06.248923 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:06.248905 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-766c6fff44-lht95" podStartSLOduration=164.248900693 podStartE2EDuration="2m44.248900693s" podCreationTimestamp="2026-04-21 01:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:52:04.244368532 +0000 UTC m=+163.193459826" watchObservedRunningTime="2026-04-21 01:52:06.248900693 +0000 UTC m=+165.197991982" Apr 21 01:52:06.266529 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:06.266491 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hwqnv" podStartSLOduration=130.383403838 podStartE2EDuration="2m12.266478473s" podCreationTimestamp="2026-04-21 01:49:54 +0000 UTC" firstStartedPulling="2026-04-21 01:52:03.37353063 +0000 UTC m=+162.322621903" lastFinishedPulling="2026-04-21 01:52:05.256605265 +0000 UTC m=+164.205696538" observedRunningTime="2026-04-21 01:52:06.26562586 +0000 UTC m=+165.214717155" watchObservedRunningTime="2026-04-21 01:52:06.266478473 +0000 UTC m=+165.215569767" Apr 21 01:52:13.705299 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:13.705215 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:52:16.238651 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.238615 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hwqnv" Apr 21 01:52:16.719595 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.719519 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vfbfv"] Apr 21 01:52:16.723780 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.723762 2579 util.go:30] "No sandbox for pod can be found. 
Apr 21 01:52:16.726755 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.726730 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 01:52:16.726856 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.726740 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 01:52:16.727871 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.727848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2rf2h\""
Apr 21 01:52:16.727952 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.727871 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 01:52:16.727952 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.727886 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 01:52:16.727952 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.727929 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 01:52:16.727952 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.727890 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 01:52:16.878765 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.878732 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-root\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.878765 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.878768 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-tls\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.878963 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.878785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0087979a-093e-49eb-8643-f323f304cd76-metrics-client-ca\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.878963 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.878843 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.878963 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.878883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-textfile\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.878963 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.878919 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-wtmp\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.878963 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.878933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wlhn\" (UniqueName: \"kubernetes.io/projected/0087979a-093e-49eb-8643-f323f304cd76-kube-api-access-7wlhn\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.879147 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.879024 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-accelerators-collector-config\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.879147 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.879043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-sys\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980266 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980228 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-accelerators-collector-config\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980266 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-sys\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-root\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980310 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-tls\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980332 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0087979a-093e-49eb-8643-f323f304cd76-metrics-client-ca\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980358 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980361 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-sys\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-textfile\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980361 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-root\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-wtmp\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980526 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wlhn\" (UniqueName: \"kubernetes.io/projected/0087979a-093e-49eb-8643-f323f304cd76-kube-api-access-7wlhn\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980941 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980622 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-wtmp\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980941 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980759 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-textfile\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.980941 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.980913 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0087979a-093e-49eb-8643-f323f304cd76-metrics-client-ca\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.981434 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.981413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-accelerators-collector-config\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.982961 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.982934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-tls\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.982961 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.982959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0087979a-093e-49eb-8643-f323f304cd76-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:16.988109 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:16.988091 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wlhn\" (UniqueName: \"kubernetes.io/projected/0087979a-093e-49eb-8643-f323f304cd76-kube-api-access-7wlhn\") pod \"node-exporter-vfbfv\" (UID: \"0087979a-093e-49eb-8643-f323f304cd76\") " pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:17.033505 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:17.033462 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vfbfv"
Apr 21 01:52:17.043651 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:52:17.043617 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0087979a_093e_49eb_8643_f323f304cd76.slice/crio-5f33f8a108ca271cd052e060b6b2870d0edfab8e913663c1b21873c7f7d3e604 WatchSource:0}: Error finding container 5f33f8a108ca271cd052e060b6b2870d0edfab8e913663c1b21873c7f7d3e604: Status 404 returned error can't find the container with id 5f33f8a108ca271cd052e060b6b2870d0edfab8e913663c1b21873c7f7d3e604
Apr 21 01:52:17.262955 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:17.262871 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfbfv" event={"ID":"0087979a-093e-49eb-8643-f323f304cd76","Type":"ContainerStarted","Data":"5f33f8a108ca271cd052e060b6b2870d0edfab8e913663c1b21873c7f7d3e604"}
Apr 21 01:52:18.268645 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:18.268557 2579 generic.go:358] "Generic (PLEG): container finished" podID="0087979a-093e-49eb-8643-f323f304cd76" containerID="73ba3d5b8651b08a5bad7a7545a02d5acfbe711d13a122ac06689d4795f094ec" exitCode=0
Apr 21 01:52:18.268645 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:18.268616 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfbfv" event={"ID":"0087979a-093e-49eb-8643-f323f304cd76","Type":"ContainerDied","Data":"73ba3d5b8651b08a5bad7a7545a02d5acfbe711d13a122ac06689d4795f094ec"}
Apr 21 01:52:19.272799 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:19.272761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfbfv" event={"ID":"0087979a-093e-49eb-8643-f323f304cd76","Type":"ContainerStarted","Data":"58a73203b53b4247a2198d0711ef0d3103a26318193d1239eabcf62c82a3e612"}
Apr 21 01:52:19.272799 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:19.272796 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfbfv" event={"ID":"0087979a-093e-49eb-8643-f323f304cd76","Type":"ContainerStarted","Data":"c658dd738adba7d6957829cf58a22fa5e115b0a4ff6cc34694c32a56b92e04f7"}
Apr 21 01:52:23.249891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:23.249836 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vfbfv" podStartSLOduration=6.578852895 podStartE2EDuration="7.249820318s" podCreationTimestamp="2026-04-21 01:52:16 +0000 UTC" firstStartedPulling="2026-04-21 01:52:17.045607429 +0000 UTC m=+175.994698701" lastFinishedPulling="2026-04-21 01:52:17.716574848 +0000 UTC m=+176.665666124" observedRunningTime="2026-04-21 01:52:19.299728766 +0000 UTC m=+178.248820062" watchObservedRunningTime="2026-04-21 01:52:23.249820318 +0000 UTC m=+182.198911613"
Apr 21 01:52:23.250748 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:23.250732 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-766c6fff44-lht95"]
Apr 21 01:52:23.254652 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:23.254630 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-766c6fff44-lht95"
Apr 21 01:52:25.235961 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:25.235898 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" podUID="119fa989-6388-4b2a-bbb2-a93810ea36d8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 01:52:35.235691 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:35.235651 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" podUID="119fa989-6388-4b2a-bbb2-a93810ea36d8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 01:52:44.167424 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:44.167394 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r2p22_70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc/serve-healthcheck-canary/0.log"
Apr 21 01:52:45.236682 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:45.236642 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" podUID="119fa989-6388-4b2a-bbb2-a93810ea36d8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 01:52:45.237159 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:45.236718 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2"
Apr 21 01:52:45.237250 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:45.237208 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ae57e84c35116137214d0599672f2098051ea369c6bd0738ed26bc7e5e55ddba"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 21 01:52:45.237298 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:45.237271 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" podUID="119fa989-6388-4b2a-bbb2-a93810ea36d8" containerName="service-proxy" containerID="cri-o://ae57e84c35116137214d0599672f2098051ea369c6bd0738ed26bc7e5e55ddba" gracePeriod=30
Apr 21 01:52:46.345595 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:46.345561 2579 generic.go:358] "Generic (PLEG): container finished" podID="119fa989-6388-4b2a-bbb2-a93810ea36d8" containerID="ae57e84c35116137214d0599672f2098051ea369c6bd0738ed26bc7e5e55ddba" exitCode=2
Apr 21 01:52:46.346034 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:46.345619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" event={"ID":"119fa989-6388-4b2a-bbb2-a93810ea36d8","Type":"ContainerDied","Data":"ae57e84c35116137214d0599672f2098051ea369c6bd0738ed26bc7e5e55ddba"}
Apr 21 01:52:46.346034 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:46.345645 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b564b976c-jrvh2" event={"ID":"119fa989-6388-4b2a-bbb2-a93810ea36d8","Type":"ContainerStarted","Data":"f496dae88c73b9846075a58d454cf2387310bd5bfa57259052e953d5c0bb6448"}
containerName="registry" containerID="cri-o://75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4" gracePeriod=30 Apr 21 01:52:48.509104 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.509079 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:52:48.619282 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619201 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-ca-trust-extracted\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619282 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619241 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-installation-pull-secrets\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619282 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619259 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjrz4\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-kube-api-access-gjrz4\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619282 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619279 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-bound-sa-token\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619584 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619396 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-certificates\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619584 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619430 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-trusted-ca\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619584 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619459 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619584 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619546 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-image-registry-private-configuration\") pod \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\" (UID: \"2a74ac67-9fe5-4614-91e8-e0d09c4332fb\") " Apr 21 01:52:48.619924 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.619888 2579 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:48.620289 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.620193 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:48.622313 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.622283 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:48.622425 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.622316 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:52:48.622425 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.622322 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:48.622657 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.622632 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:52:48.622657 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.622639 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-kube-api-access-gjrz4" (OuterVolumeSpecName: "kube-api-access-gjrz4") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "kube-api-access-gjrz4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:52:48.630559 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.630537 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2a74ac67-9fe5-4614-91e8-e0d09c4332fb" (UID: "2a74ac67-9fe5-4614-91e8-e0d09c4332fb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:52:48.720240 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720207 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-ca-trust-extracted\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:48.720240 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720235 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-installation-pull-secrets\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:48.720240 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720244 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjrz4\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-kube-api-access-gjrz4\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:48.720433 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720253 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-bound-sa-token\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:48.720433 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720261 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-certificates\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:48.720433 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720270 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-trusted-ca\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:48.720433 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720278 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-registry-tls\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:48.720433 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:48.720287 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2a74ac67-9fe5-4614-91e8-e0d09c4332fb-image-registry-private-configuration\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:52:49.354478 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.354443 2579 generic.go:358] "Generic (PLEG): container finished" podID="2a74ac67-9fe5-4614-91e8-e0d09c4332fb" containerID="75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4" exitCode=0 Apr 21 01:52:49.354890 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.354502 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-766c6fff44-lht95" Apr 21 01:52:49.354890 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.354525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-766c6fff44-lht95" event={"ID":"2a74ac67-9fe5-4614-91e8-e0d09c4332fb","Type":"ContainerDied","Data":"75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4"} Apr 21 01:52:49.354890 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.354557 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-766c6fff44-lht95" event={"ID":"2a74ac67-9fe5-4614-91e8-e0d09c4332fb","Type":"ContainerDied","Data":"43c9c1b0c2704eb0382c5561bbb2ed01419d917190c4501c575d8e18faee0c5d"} Apr 21 01:52:49.354890 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.354570 2579 scope.go:117] "RemoveContainer" containerID="75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4" Apr 21 01:52:49.362548 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.362531 2579 scope.go:117] "RemoveContainer" containerID="75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4" Apr 21 01:52:49.362773 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:52:49.362754 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4\": container with ID starting with 75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4 not found: ID does not exist" containerID="75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4" Apr 21 01:52:49.362841 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.362785 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4"} err="failed to get container status \"75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4\": rpc error: code = NotFound desc = could not find container \"75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4\": container with ID starting with 75e944ec15a1cf89137030a1bbc39d26ed0fad0dc40dee1093dc6fdc62476cc4 not found: ID does not exist" Apr 21 01:52:49.374020 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.373987 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-766c6fff44-lht95"] Apr 21 01:52:49.377356 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.377337 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-766c6fff44-lht95"] Apr 21 01:52:49.708875 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:52:49.708784 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a74ac67-9fe5-4614-91e8-e0d09c4332fb" path="/var/lib/kubelet/pods/2a74ac67-9fe5-4614-91e8-e0d09c4332fb/volumes" Apr 21 01:53:33.526462 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:33.526422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:53:33.528934 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:33.528910 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
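The paired "RemoveContainer" records above followed by a NotFound error are a benign race, not a failure: by the time the kubelet asks CRI-O for the container's status, the container has already been deleted along with its sandbox. A short sketch, assuming stdlib Python and the record shapes shown above, for filtering that benign pattern out when scanning a journal for real runtime errors (the function name is illustrative):

import re

# Shapes from the records above:
#   scope.go:117] "RemoveContainer" containerID="..."
#   log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound ..." containerID="..."
REMOVED = re.compile(r'"RemoveContainer" containerID="([0-9a-f]+)"')
NOT_FOUND = re.compile(r'"ContainerStatus from runtime service failed".*code = NotFound.*containerID="([0-9a-f]+)"')

def benign_notfound_ids(journal_text):
    """Container IDs whose NotFound errors follow their own RemoveContainer."""
    removed, benign = set(), []
    for line in journal_text.splitlines():
        if m := REMOVED.search(line):
            removed.add(m.group(1))
        elif (m := NOT_FOUND.search(line)) and m.group(1) in removed:
            benign.append(m.group(1))
    return benign

Against this excerpt it returns only the registry container ID 75e944ec...; a NotFound without a preceding RemoveContainer would stay visible as a genuine anomaly.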
\"kubernetes.io/secret/202b33ea-296c-464b-8ee5-774d048859c2-metrics-certs\") pod \"network-metrics-daemon-lwrh2\" (UID: \"202b33ea-296c-464b-8ee5-774d048859c2\") " pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:53:33.809403 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:33.809322 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68xr4\"" Apr 21 01:53:33.816392 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:33.816374 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lwrh2" Apr 21 01:53:33.932500 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:33.932471 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lwrh2"] Apr 21 01:53:33.935887 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:53:33.935856 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202b33ea_296c_464b_8ee5_774d048859c2.slice/crio-c3cf00e88830c7ecdd6a0f99081b67c706ddeef204751c7a5f96898c949dc210 WatchSource:0}: Error finding container c3cf00e88830c7ecdd6a0f99081b67c706ddeef204751c7a5f96898c949dc210: Status 404 returned error can't find the container with id c3cf00e88830c7ecdd6a0f99081b67c706ddeef204751c7a5f96898c949dc210 Apr 21 01:53:34.474619 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:34.474570 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lwrh2" event={"ID":"202b33ea-296c-464b-8ee5-774d048859c2","Type":"ContainerStarted","Data":"c3cf00e88830c7ecdd6a0f99081b67c706ddeef204751c7a5f96898c949dc210"} Apr 21 01:53:35.480820 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:35.480777 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lwrh2" event={"ID":"202b33ea-296c-464b-8ee5-774d048859c2","Type":"ContainerStarted","Data":"61a9359ba1348117d0970883b4858ac7d63cc37b7920f17fb8c951170d351c3a"} Apr 21 01:53:35.480820 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:35.480817 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lwrh2" event={"ID":"202b33ea-296c-464b-8ee5-774d048859c2","Type":"ContainerStarted","Data":"d87fec0ab3c359a1c30a0a902b828624c9f65c5430afc5db72bfba791a404580"} Apr 21 01:53:35.495369 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:53:35.495324 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lwrh2" podStartSLOduration=253.637963039 podStartE2EDuration="4m14.495307323s" podCreationTimestamp="2026-04-21 01:49:21 +0000 UTC" firstStartedPulling="2026-04-21 01:53:33.937640488 +0000 UTC m=+252.886731763" lastFinishedPulling="2026-04-21 01:53:34.794984759 +0000 UTC m=+253.744076047" observedRunningTime="2026-04-21 01:53:35.494600405 +0000 UTC m=+254.443691700" watchObservedRunningTime="2026-04-21 01:53:35.495307323 +0000 UTC m=+254.444398617" Apr 21 01:54:21.562681 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:54:21.562652 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log" Apr 21 01:54:21.563133 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:54:21.562724 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log" Apr 21 01:54:21.568730 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:54:21.568710 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 01:55:37.478515 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.478431 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-669wg"] Apr 21 01:55:37.478962 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.478672 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a74ac67-9fe5-4614-91e8-e0d09c4332fb" containerName="registry" Apr 21 01:55:37.478962 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.478682 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a74ac67-9fe5-4614-91e8-e0d09c4332fb" containerName="registry" Apr 21 01:55:37.478962 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.478720 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a74ac67-9fe5-4614-91e8-e0d09c4332fb" containerName="registry" Apr 21 01:55:37.481508 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.481492 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.484051 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.484027 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 01:55:37.485169 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.485152 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-swvgh\"" Apr 21 01:55:37.485243 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.485194 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 01:55:37.490633 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.490613 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-669wg"] Apr 21 01:55:37.550056 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.550021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/220f5d7e-8663-4244-ae31-300aada198f5-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-669wg\" (UID: \"220f5d7e-8663-4244-ae31-300aada198f5\") " pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.550207 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.550083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snzcl\" (UniqueName: \"kubernetes.io/projected/220f5d7e-8663-4244-ae31-300aada198f5-kube-api-access-snzcl\") pod \"cert-manager-cainjector-68b757865b-669wg\" (UID: \"220f5d7e-8663-4244-ae31-300aada198f5\") " pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.651336 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.651293 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/220f5d7e-8663-4244-ae31-300aada198f5-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-669wg\" (UID: \"220f5d7e-8663-4244-ae31-300aada198f5\") " pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.651453 ip-10-0-129-42 
kubenswrapper[2579]: I0421 01:55:37.651352 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snzcl\" (UniqueName: \"kubernetes.io/projected/220f5d7e-8663-4244-ae31-300aada198f5-kube-api-access-snzcl\") pod \"cert-manager-cainjector-68b757865b-669wg\" (UID: \"220f5d7e-8663-4244-ae31-300aada198f5\") " pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.659233 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.659207 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/220f5d7e-8663-4244-ae31-300aada198f5-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-669wg\" (UID: \"220f5d7e-8663-4244-ae31-300aada198f5\") " pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.659350 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.659335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snzcl\" (UniqueName: \"kubernetes.io/projected/220f5d7e-8663-4244-ae31-300aada198f5-kube-api-access-snzcl\") pod \"cert-manager-cainjector-68b757865b-669wg\" (UID: \"220f5d7e-8663-4244-ae31-300aada198f5\") " pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.790670 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.790580 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" Apr 21 01:55:37.907829 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.907799 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-669wg"] Apr 21 01:55:37.910774 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:55:37.910744 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod220f5d7e_8663_4244_ae31_300aada198f5.slice/crio-51275b7e2344abae77eda1a217d62d5d6098d2953fce5b24d5cf57ed789dc6a5 WatchSource:0}: Error finding container 51275b7e2344abae77eda1a217d62d5d6098d2953fce5b24d5cf57ed789dc6a5: Status 404 returned error can't find the container with id 51275b7e2344abae77eda1a217d62d5d6098d2953fce5b24d5cf57ed789dc6a5 Apr 21 01:55:37.912491 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:37.912474 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 01:55:38.798930 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:38.798896 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" event={"ID":"220f5d7e-8663-4244-ae31-300aada198f5","Type":"ContainerStarted","Data":"51275b7e2344abae77eda1a217d62d5d6098d2953fce5b24d5cf57ed789dc6a5"} Apr 21 01:55:40.806287 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:40.806255 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" event={"ID":"220f5d7e-8663-4244-ae31-300aada198f5","Type":"ContainerStarted","Data":"e42b5bd7af3ffff231d5ba57a57a082b21c23390c51e8dbabf6e634d5c4cde33"} Apr 21 01:55:40.822216 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:40.822174 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" podStartSLOduration=1.048114093 podStartE2EDuration="3.822160848s" podCreationTimestamp="2026-04-21 01:55:37 +0000 UTC" firstStartedPulling="2026-04-21 01:55:37.912601998 +0000 UTC m=+376.861693271" 
Apr 21 01:55:40.822216 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:55:40.822174 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-669wg" podStartSLOduration=1.048114093 podStartE2EDuration="3.822160848s" podCreationTimestamp="2026-04-21 01:55:37 +0000 UTC" firstStartedPulling="2026-04-21 01:55:37.912601998 +0000 UTC m=+376.861693271" lastFinishedPulling="2026-04-21 01:55:40.68664875 +0000 UTC m=+379.635740026" observedRunningTime="2026-04-21 01:55:40.821085894 +0000 UTC m=+379.770177201" watchObservedRunningTime="2026-04-21 01:55:40.822160848 +0000 UTC m=+379.771252143"
Apr 21 01:56:12.285658 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.285627 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"]
Apr 21 01:56:12.288740 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.288720 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.291620 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.291592 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 01:56:12.291745 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.291620 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 01:56:12.291822 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.291805 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 01:56:12.291935 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.291909 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jjk2t\""
Apr 21 01:56:12.292064 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.291912 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 01:56:12.326044 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.326014 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"]
Apr 21 01:56:12.389342 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.389312 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e215592-89d0-4593-9c7d-edec627230a9-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.389506 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.389373 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e215592-89d0-4593-9c7d-edec627230a9-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.389506 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.389451 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvl8c\" (UniqueName: \"kubernetes.io/projected/6e215592-89d0-4593-9c7d-edec627230a9-kube-api-access-xvl8c\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.489870 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.489829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvl8c\" (UniqueName: \"kubernetes.io/projected/6e215592-89d0-4593-9c7d-edec627230a9-kube-api-access-xvl8c\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.490048 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.489878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e215592-89d0-4593-9c7d-edec627230a9-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.490048 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.489926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e215592-89d0-4593-9c7d-edec627230a9-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.492501 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.492477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e215592-89d0-4593-9c7d-edec627230a9-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.492610 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.492507 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e215592-89d0-4593-9c7d-edec627230a9-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.497783 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.497762 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvl8c\" (UniqueName: \"kubernetes.io/projected/6e215592-89d0-4593-9c7d-edec627230a9-kube-api-access-xvl8c\") pod \"opendatahub-operator-controller-manager-64bbc69db5-92n2w\" (UID: \"6e215592-89d0-4593-9c7d-edec627230a9\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.598327 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.598251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:12.719312 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.719288 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"]
Apr 21 01:56:12.721808 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:56:12.721780 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e215592_89d0_4593_9c7d_edec627230a9.slice/crio-856fddd3029c79716c262f02ddb9e9a59b7a4d0dcd2cc1b940e32c4190c32e33 WatchSource:0}: Error finding container 856fddd3029c79716c262f02ddb9e9a59b7a4d0dcd2cc1b940e32c4190c32e33: Status 404 returned error can't find the container with id 856fddd3029c79716c262f02ddb9e9a59b7a4d0dcd2cc1b940e32c4190c32e33
Apr 21 01:56:12.886857 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:12.886779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w" event={"ID":"6e215592-89d0-4593-9c7d-edec627230a9","Type":"ContainerStarted","Data":"856fddd3029c79716c262f02ddb9e9a59b7a4d0dcd2cc1b940e32c4190c32e33"}
Apr 21 01:56:15.896467 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:15.896430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w" event={"ID":"6e215592-89d0-4593-9c7d-edec627230a9","Type":"ContainerStarted","Data":"93b8a02c2de046f73a5c1256ee93ad7e3726e41863306ee6d5542e98dc6ae6fa"}
Apr 21 01:56:15.896935 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:15.896550 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:15.916309 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:15.916267 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w" podStartSLOduration=1.388399503 podStartE2EDuration="3.91625459s" podCreationTimestamp="2026-04-21 01:56:12 +0000 UTC" firstStartedPulling="2026-04-21 01:56:12.723573091 +0000 UTC m=+411.672664364" lastFinishedPulling="2026-04-21 01:56:15.251428172 +0000 UTC m=+414.200519451" observedRunningTime="2026-04-21 01:56:15.91399467 +0000 UTC m=+414.863085959" watchObservedRunningTime="2026-04-21 01:56:15.91625459 +0000 UTC m=+414.865345885"
Apr 21 01:56:26.901598 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:26.901570 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-92n2w"
Apr 21 01:56:30.579342 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.579303 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"]
Apr 21 01:56:30.582623 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.582603 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.585622 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.585603 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 21 01:56:30.585745 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.585727 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 01:56:30.586825 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.586808 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 21 01:56:30.586921 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.586842 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 01:56:30.587044 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.587029 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-8qshb\""
Apr 21 01:56:30.592471 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.592450 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"]
Apr 21 01:56:30.722409 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.722377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh42r\" (UniqueName: \"kubernetes.io/projected/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-kube-api-access-mh42r\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.722602 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.722434 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-tls-certs\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.722602 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.722463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-tmp\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.823456 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.823399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-tls-certs\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.823456 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.823450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-tmp\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.823733 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.823477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh42r\" (UniqueName: \"kubernetes.io/projected/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-kube-api-access-mh42r\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.826251 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.826225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-tmp\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.826251 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.826242 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-tls-certs\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.832188 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.832135 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh42r\" (UniqueName: \"kubernetes.io/projected/fda4a570-8f0b-47fa-8cdd-5dfbc383110d-kube-api-access-mh42r\") pod \"kube-auth-proxy-56c874cd68-rzt28\" (UID: \"fda4a570-8f0b-47fa-8cdd-5dfbc383110d\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:30.892108 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:30.892087 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"
Apr 21 01:56:31.011926 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:31.011893 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56c874cd68-rzt28"]
Apr 21 01:56:31.015394 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:56:31.015357 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfda4a570_8f0b_47fa_8cdd_5dfbc383110d.slice/crio-d8e7091f73837f2074200c2f3cece31d8c4377125b68f57355bf24d92ce4c7a5 WatchSource:0}: Error finding container d8e7091f73837f2074200c2f3cece31d8c4377125b68f57355bf24d92ce4c7a5: Status 404 returned error can't find the container with id d8e7091f73837f2074200c2f3cece31d8c4377125b68f57355bf24d92ce4c7a5
Apr 21 01:56:31.941157 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:31.941119 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28" event={"ID":"fda4a570-8f0b-47fa-8cdd-5dfbc383110d","Type":"ContainerStarted","Data":"d8e7091f73837f2074200c2f3cece31d8c4377125b68f57355bf24d92ce4c7a5"}
Apr 21 01:56:33.962229 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:33.962191 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-rlcxz"]
Apr 21 01:56:33.965224 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:33.965202 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:33.967865 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:33.967845 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-xbsp2\""
Apr 21 01:56:33.967963 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:33.967894 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 21 01:56:33.975143 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:33.975114 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-rlcxz"]
Apr 21 01:56:34.149120 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.149086 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:34.149291 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.149149 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsn92\" (UniqueName: \"kubernetes.io/projected/64d57abe-4892-46cb-bd51-25c5e8213a5b-kube-api-access-nsn92\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:34.250354 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.250318 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsn92\" (UniqueName: \"kubernetes.io/projected/64d57abe-4892-46cb-bd51-25c5e8213a5b-kube-api-access-nsn92\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:34.250529 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.250395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:34.250604 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:56:34.250519 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 01:56:34.250604 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:56:34.250601 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert podName:64d57abe-4892-46cb-bd51-25c5e8213a5b nodeName:}" failed. No retries permitted until 2026-04-21 01:56:34.750578113 +0000 UTC m=+433.699669393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert") pod "odh-model-controller-858dbf95b8-rlcxz" (UID: "64d57abe-4892-46cb-bd51-25c5e8213a5b") : secret "odh-model-controller-webhook-cert" not found
Apr 21 01:56:34.261704 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.261678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsn92\" (UniqueName: \"kubernetes.io/projected/64d57abe-4892-46cb-bd51-25c5e8213a5b-kube-api-access-nsn92\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:34.755782 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.755728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:34.755967 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:56:34.755907 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 01:56:34.756080 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:56:34.756000 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert podName:64d57abe-4892-46cb-bd51-25c5e8213a5b nodeName:}" failed. No retries permitted until 2026-04-21 01:56:35.755960898 +0000 UTC m=+434.705052175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert") pod "odh-model-controller-858dbf95b8-rlcxz" (UID: "64d57abe-4892-46cb-bd51-25c5e8213a5b") : secret "odh-model-controller-webhook-cert" not found
Apr 21 01:56:34.951176 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.951138 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28" event={"ID":"fda4a570-8f0b-47fa-8cdd-5dfbc383110d","Type":"ContainerStarted","Data":"9b817da40e58d51da3192bf7c056c36d10b1897a8a907c412f3abe8f4d723e2d"}
Apr 21 01:56:34.966034 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:34.965958 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-56c874cd68-rzt28" podStartSLOduration=1.139760911 podStartE2EDuration="4.965945385s" podCreationTimestamp="2026-04-21 01:56:30 +0000 UTC" firstStartedPulling="2026-04-21 01:56:31.017501239 +0000 UTC m=+429.966592539" lastFinishedPulling="2026-04-21 01:56:34.84368574 +0000 UTC m=+433.792777013" observedRunningTime="2026-04-21 01:56:34.965246715 +0000 UTC m=+433.914338011" watchObservedRunningTime="2026-04-21 01:56:34.965945385 +0000 UTC m=+433.915036680"
Apr 21 01:56:35.764611 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:35.764577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
Apr 21 01:56:35.767059 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:35.767038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz"
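The two nestedpendingoperations.go:348 records above expose the kubelet's per-volume retry backoff: the first missing-secret failure schedules a retry after 500ms, the second after 1s, so the delay doubles on each consecutive failure. A minimal sketch of that policy; the initial delay and doubling factor are read straight off the records above, while the cap is an assumed ceiling (the kubelet does bound this backoff, but the exact limit is not visible in this journal):

from datetime import timedelta

def next_retry_delay(failures, initial=timedelta(milliseconds=500),
                     factor=2.0, cap=timedelta(minutes=2)):
    """Delay before retry number `failures` (1-based), doubling each time.

    initial=500ms and factor=2 match the two records above
    (durationBeforeRetry 500ms, then 1s); cap is an assumed ceiling.
    """
    delay = initial * (factor ** (failures - 1))
    return min(delay, cap)

# The sequence implied by the journal: 0.5s, 1s, 2s, 4s, 8s, ...
print([next_retry_delay(n).total_seconds() for n in range(1, 6)])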
(UniqueName: \"kubernetes.io/secret/64d57abe-4892-46cb-bd51-25c5e8213a5b-cert\") pod \"odh-model-controller-858dbf95b8-rlcxz\" (UID: \"64d57abe-4892-46cb-bd51-25c5e8213a5b\") " pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" Apr 21 01:56:35.779088 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:35.779060 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" Apr 21 01:56:35.900667 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:35.900638 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-rlcxz"] Apr 21 01:56:35.903473 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:56:35.903448 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d57abe_4892_46cb_bd51_25c5e8213a5b.slice/crio-ac4009398548338a5e1e2b4e48aca108dc051e3d176d68acba50ae61ae43a97a WatchSource:0}: Error finding container ac4009398548338a5e1e2b4e48aca108dc051e3d176d68acba50ae61ae43a97a: Status 404 returned error can't find the container with id ac4009398548338a5e1e2b4e48aca108dc051e3d176d68acba50ae61ae43a97a Apr 21 01:56:35.954989 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:35.954950 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" event={"ID":"64d57abe-4892-46cb-bd51-25c5e8213a5b","Type":"ContainerStarted","Data":"ac4009398548338a5e1e2b4e48aca108dc051e3d176d68acba50ae61ae43a97a"} Apr 21 01:56:39.839731 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.839698 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xswb2"] Apr 21 01:56:39.842696 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.842680 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:39.845624 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.845599 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 21 01:56:39.845624 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.845617 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-tjpr8\"" Apr 21 01:56:39.852209 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.852161 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xswb2"] Apr 21 01:56:39.896225 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.896191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c3287f-4a89-491a-9e89-2727be40c662-cert\") pod \"kserve-controller-manager-856948b99f-xswb2\" (UID: \"88c3287f-4a89-491a-9e89-2727be40c662\") " pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:39.896225 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.896222 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwhf\" (UniqueName: \"kubernetes.io/projected/88c3287f-4a89-491a-9e89-2727be40c662-kube-api-access-qcwhf\") pod \"kserve-controller-manager-856948b99f-xswb2\" (UID: \"88c3287f-4a89-491a-9e89-2727be40c662\") " pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:39.967668 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.967638 2579 generic.go:358] "Generic (PLEG): container finished" podID="64d57abe-4892-46cb-bd51-25c5e8213a5b" containerID="855aa23d7fe1dec215820780edf5c5cf05db2c639d39a9c2a50037b9cf38053c" exitCode=1 Apr 21 01:56:39.967846 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.967723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" event={"ID":"64d57abe-4892-46cb-bd51-25c5e8213a5b","Type":"ContainerDied","Data":"855aa23d7fe1dec215820780edf5c5cf05db2c639d39a9c2a50037b9cf38053c"} Apr 21 01:56:39.968023 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.968009 2579 scope.go:117] "RemoveContainer" containerID="855aa23d7fe1dec215820780edf5c5cf05db2c639d39a9c2a50037b9cf38053c" Apr 21 01:56:39.997204 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.997171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c3287f-4a89-491a-9e89-2727be40c662-cert\") pod \"kserve-controller-manager-856948b99f-xswb2\" (UID: \"88c3287f-4a89-491a-9e89-2727be40c662\") " pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:39.997353 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:39.997207 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcwhf\" (UniqueName: \"kubernetes.io/projected/88c3287f-4a89-491a-9e89-2727be40c662-kube-api-access-qcwhf\") pod \"kserve-controller-manager-856948b99f-xswb2\" (UID: \"88c3287f-4a89-491a-9e89-2727be40c662\") " pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:39.997353 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:56:39.997306 2579 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 21 01:56:39.997486 ip-10-0-129-42 
kubenswrapper[2579]: E0421 01:56:39.997362 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88c3287f-4a89-491a-9e89-2727be40c662-cert podName:88c3287f-4a89-491a-9e89-2727be40c662 nodeName:}" failed. No retries permitted until 2026-04-21 01:56:40.49734758 +0000 UTC m=+439.446438853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88c3287f-4a89-491a-9e89-2727be40c662-cert") pod "kserve-controller-manager-856948b99f-xswb2" (UID: "88c3287f-4a89-491a-9e89-2727be40c662") : secret "kserve-webhook-server-cert" not found Apr 21 01:56:40.025351 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.025317 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwhf\" (UniqueName: \"kubernetes.io/projected/88c3287f-4a89-491a-9e89-2727be40c662-kube-api-access-qcwhf\") pod \"kserve-controller-manager-856948b99f-xswb2\" (UID: \"88c3287f-4a89-491a-9e89-2727be40c662\") " pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:40.505761 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.505675 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c3287f-4a89-491a-9e89-2727be40c662-cert\") pod \"kserve-controller-manager-856948b99f-xswb2\" (UID: \"88c3287f-4a89-491a-9e89-2727be40c662\") " pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:40.508176 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.508154 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c3287f-4a89-491a-9e89-2727be40c662-cert\") pod \"kserve-controller-manager-856948b99f-xswb2\" (UID: \"88c3287f-4a89-491a-9e89-2727be40c662\") " pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:40.754748 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.754710 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:40.880257 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.880227 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xswb2"] Apr 21 01:56:40.882686 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:56:40.882655 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88c3287f_4a89_491a_9e89_2727be40c662.slice/crio-7e43dd57bc6d733b3190d6c0244f8f6d430e304367d83ea2469c7ca95574b0d8 WatchSource:0}: Error finding container 7e43dd57bc6d733b3190d6c0244f8f6d430e304367d83ea2469c7ca95574b0d8: Status 404 returned error can't find the container with id 7e43dd57bc6d733b3190d6c0244f8f6d430e304367d83ea2469c7ca95574b0d8 Apr 21 01:56:40.971648 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.971617 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" event={"ID":"64d57abe-4892-46cb-bd51-25c5e8213a5b","Type":"ContainerStarted","Data":"1ad76e0e3ee5e6b8d3bc252c539d084c183b94c607beb6ff737f71b55b244a6c"} Apr 21 01:56:40.971835 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.971740 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" Apr 21 01:56:40.972698 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.972676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" event={"ID":"88c3287f-4a89-491a-9e89-2727be40c662","Type":"ContainerStarted","Data":"7e43dd57bc6d733b3190d6c0244f8f6d430e304367d83ea2469c7ca95574b0d8"} Apr 21 01:56:40.988585 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:40.988540 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" podStartSLOduration=3.633562458 podStartE2EDuration="7.988526793s" podCreationTimestamp="2026-04-21 01:56:33 +0000 UTC" firstStartedPulling="2026-04-21 01:56:35.904732848 +0000 UTC m=+434.853824124" lastFinishedPulling="2026-04-21 01:56:40.259697186 +0000 UTC m=+439.208788459" observedRunningTime="2026-04-21 01:56:40.986730899 +0000 UTC m=+439.935822194" watchObservedRunningTime="2026-04-21 01:56:40.988526793 +0000 UTC m=+439.937618088" Apr 21 01:56:43.984005 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:43.983956 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" event={"ID":"88c3287f-4a89-491a-9e89-2727be40c662","Type":"ContainerStarted","Data":"f1dc332737b28dcbc77bd387370524b77afe89b5879bfc0c12bed89c045667ee"} Apr 21 01:56:43.984466 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:43.984086 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:56:44.005075 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.005026 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" podStartSLOduration=2.599599956 podStartE2EDuration="5.005011244s" podCreationTimestamp="2026-04-21 01:56:39 +0000 UTC" firstStartedPulling="2026-04-21 01:56:40.884088616 +0000 UTC m=+439.833179890" lastFinishedPulling="2026-04-21 01:56:43.289499905 +0000 UTC m=+442.238591178" observedRunningTime="2026-04-21 01:56:44.004263961 +0000 UTC m=+442.953355258" 
watchObservedRunningTime="2026-04-21 01:56:44.005011244 +0000 UTC m=+442.954102539" Apr 21 01:56:44.901284 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.901249 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr"] Apr 21 01:56:44.905510 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.905475 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:44.908625 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.908598 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-rvgk6\"" Apr 21 01:56:44.908625 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.908613 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 21 01:56:44.908625 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.908623 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 21 01:56:44.915617 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.915596 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr"] Apr 21 01:56:44.939876 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.939838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89x5\" (UniqueName: \"kubernetes.io/projected/7884a570-91ed-41ae-a30f-04e9f95855ba-kube-api-access-v89x5\") pod \"servicemesh-operator3-55f49c5f94-lh5vr\" (UID: \"7884a570-91ed-41ae-a30f-04e9f95855ba\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:44.939876 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:44.939884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7884a570-91ed-41ae-a30f-04e9f95855ba-operator-config\") pod \"servicemesh-operator3-55f49c5f94-lh5vr\" (UID: \"7884a570-91ed-41ae-a30f-04e9f95855ba\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:45.040347 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:45.040308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v89x5\" (UniqueName: \"kubernetes.io/projected/7884a570-91ed-41ae-a30f-04e9f95855ba-kube-api-access-v89x5\") pod \"servicemesh-operator3-55f49c5f94-lh5vr\" (UID: \"7884a570-91ed-41ae-a30f-04e9f95855ba\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:45.040803 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:45.040354 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7884a570-91ed-41ae-a30f-04e9f95855ba-operator-config\") pod \"servicemesh-operator3-55f49c5f94-lh5vr\" (UID: \"7884a570-91ed-41ae-a30f-04e9f95855ba\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:45.042994 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:45.042944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/7884a570-91ed-41ae-a30f-04e9f95855ba-operator-config\") pod \"servicemesh-operator3-55f49c5f94-lh5vr\" (UID: \"7884a570-91ed-41ae-a30f-04e9f95855ba\") " 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:45.055997 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:45.055951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89x5\" (UniqueName: \"kubernetes.io/projected/7884a570-91ed-41ae-a30f-04e9f95855ba-kube-api-access-v89x5\") pod \"servicemesh-operator3-55f49c5f94-lh5vr\" (UID: \"7884a570-91ed-41ae-a30f-04e9f95855ba\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:45.217552 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:45.217462 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:45.350728 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:45.350698 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr"] Apr 21 01:56:45.352911 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:56:45.352871 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7884a570_91ed_41ae_a30f_04e9f95855ba.slice/crio-2f78cf14bad26ed917db172cc0344c26692aeabd48244f28da6cce3c2c0c88df WatchSource:0}: Error finding container 2f78cf14bad26ed917db172cc0344c26692aeabd48244f28da6cce3c2c0c88df: Status 404 returned error can't find the container with id 2f78cf14bad26ed917db172cc0344c26692aeabd48244f28da6cce3c2c0c88df Apr 21 01:56:45.990886 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:45.990852 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" event={"ID":"7884a570-91ed-41ae-a30f-04e9f95855ba","Type":"ContainerStarted","Data":"2f78cf14bad26ed917db172cc0344c26692aeabd48244f28da6cce3c2c0c88df"} Apr 21 01:56:50.004249 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:50.004215 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" event={"ID":"7884a570-91ed-41ae-a30f-04e9f95855ba","Type":"ContainerStarted","Data":"13e05989e27eec38c814a4bb0e70cda47d3366760e5782f5dc4bb3e26f1abc9a"} Apr 21 01:56:50.004691 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:50.004347 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:56:50.023561 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:50.023436 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" podStartSLOduration=2.087167483 podStartE2EDuration="6.023419937s" podCreationTimestamp="2026-04-21 01:56:44 +0000 UTC" firstStartedPulling="2026-04-21 01:56:45.355516799 +0000 UTC m=+444.304608087" lastFinishedPulling="2026-04-21 01:56:49.291769265 +0000 UTC m=+448.240860541" observedRunningTime="2026-04-21 01:56:50.022280475 +0000 UTC m=+448.971371769" watchObservedRunningTime="2026-04-21 01:56:50.023419937 +0000 UTC m=+448.972511233" Apr 21 01:56:51.979767 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:51.979732 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-rlcxz" Apr 21 01:56:56.123997 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.123938 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz"] Apr 21 01:56:56.127309 ip-10-0-129-42 kubenswrapper[2579]: I0421 
01:56:56.127289 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.130103 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.130081 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 21 01:56:56.130223 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.130131 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 01:56:56.130223 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.130170 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-qqsrw\"" Apr 21 01:56:56.130364 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.130347 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 21 01:56:56.130409 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.130375 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 21 01:56:56.135685 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.135660 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz"] Apr 21 01:56:56.220911 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.220879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/15159b7f-80c1-423d-9922-0973f6795205-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.220911 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.220913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.221121 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.220934 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjbw4\" (UniqueName: \"kubernetes.io/projected/15159b7f-80c1-423d-9922-0973f6795205-kube-api-access-qjbw4\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.221121 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.220957 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.221121 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.221051 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/15159b7f-80c1-423d-9922-0973f6795205-local-certs\") 
pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.221121 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.221072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.221246 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.221128 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/15159b7f-80c1-423d-9922-0973f6795205-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.321680 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.321651 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/15159b7f-80c1-423d-9922-0973f6795205-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.321836 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.321686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.321836 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.321715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjbw4\" (UniqueName: \"kubernetes.io/projected/15159b7f-80c1-423d-9922-0973f6795205-kube-api-access-qjbw4\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.321836 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.321749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.321836 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.321799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/15159b7f-80c1-423d-9922-0973f6795205-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.321836 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.321826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.322107 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.321856 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/15159b7f-80c1-423d-9922-0973f6795205-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.322395 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.322369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/15159b7f-80c1-423d-9922-0973f6795205-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.324524 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.324491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/15159b7f-80c1-423d-9922-0973f6795205-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.324621 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.324608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.324731 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.324710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.325056 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.325037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/15159b7f-80c1-423d-9922-0973f6795205-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.330698 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.330677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjbw4\" (UniqueName: \"kubernetes.io/projected/15159b7f-80c1-423d-9922-0973f6795205-kube-api-access-qjbw4\") pod \"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.330773 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.330746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/15159b7f-80c1-423d-9922-0973f6795205-istio-token\") pod 
\"istiod-openshift-gateway-55ff986f96-b9mwz\" (UID: \"15159b7f-80c1-423d-9922-0973f6795205\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.439117 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.439031 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:56:56.571057 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:56.571027 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz"] Apr 21 01:56:56.574791 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:56:56.574760 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15159b7f_80c1_423d_9922_0973f6795205.slice/crio-96cb0aefe9395859cce0cea28f92e18376475726615bca081de06934af626aae WatchSource:0}: Error finding container 96cb0aefe9395859cce0cea28f92e18376475726615bca081de06934af626aae: Status 404 returned error can't find the container with id 96cb0aefe9395859cce0cea28f92e18376475726615bca081de06934af626aae Apr 21 01:56:57.028339 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:57.028305 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" event={"ID":"15159b7f-80c1-423d-9922-0973f6795205","Type":"ContainerStarted","Data":"96cb0aefe9395859cce0cea28f92e18376475726615bca081de06934af626aae"} Apr 21 01:56:59.696255 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:59.696214 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 01:56:59.696561 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:56:59.696278 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 01:57:00.040159 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:00.040117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" event={"ID":"15159b7f-80c1-423d-9922-0973f6795205","Type":"ContainerStarted","Data":"fa83f4d03383f468ce77a308278418af6f6e06fc4de0fc31cb7b049b62a7c47d"} Apr 21 01:57:00.040396 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:00.040324 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:57:00.042150 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:00.042120 2579 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-b9mwz container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 21 01:57:00.042279 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:00.042178 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" podUID="15159b7f-80c1-423d-9922-0973f6795205" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 01:57:00.060274 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:00.060227 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" podStartSLOduration=0.941291655 podStartE2EDuration="4.060213761s" 
podCreationTimestamp="2026-04-21 01:56:56 +0000 UTC" firstStartedPulling="2026-04-21 01:56:56.577077786 +0000 UTC m=+455.526169063" lastFinishedPulling="2026-04-21 01:56:59.695999896 +0000 UTC m=+458.645091169" observedRunningTime="2026-04-21 01:57:00.05948282 +0000 UTC m=+459.008574117" watchObservedRunningTime="2026-04-21 01:57:00.060213761 +0000 UTC m=+459.009305057" Apr 21 01:57:01.010431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:01.010402 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-lh5vr" Apr 21 01:57:01.044004 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:01.043948 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-b9mwz" Apr 21 01:57:14.991707 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:57:14.991674 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-xswb2" Apr 21 01:58:11.086649 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.086569 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5"] Apr 21 01:58:11.088674 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.088652 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.092278 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.092261 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-5vrzn\"" Apr 21 01:58:11.092506 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.092493 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 01:58:11.092655 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.092638 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 01:58:11.099964 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.099941 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5"] Apr 21 01:58:11.202341 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.202309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbdw\" (UniqueName: \"kubernetes.io/projected/66110ac7-61a3-4880-804e-13d9cafd8a53-kube-api-access-7pbdw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.202510 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.202360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66110ac7-61a3-4880-804e-13d9cafd8a53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.303141 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.303104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/66110ac7-61a3-4880-804e-13d9cafd8a53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.303310 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.303171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbdw\" (UniqueName: \"kubernetes.io/projected/66110ac7-61a3-4880-804e-13d9cafd8a53-kube-api-access-7pbdw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.303517 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.303497 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66110ac7-61a3-4880-804e-13d9cafd8a53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.314615 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.314580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbdw\" (UniqueName: \"kubernetes.io/projected/66110ac7-61a3-4880-804e-13d9cafd8a53-kube-api-access-7pbdw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.399669 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.399572 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:11.525935 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:11.525914 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5"] Apr 21 01:58:11.528340 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:58:11.528311 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66110ac7_61a3_4880_804e_13d9cafd8a53.slice/crio-3b738515971d7f6efc0a6c7c6f9f3bc7e0dddaf98b1c97bb1305b1f071c2a2ed WatchSource:0}: Error finding container 3b738515971d7f6efc0a6c7c6f9f3bc7e0dddaf98b1c97bb1305b1f071c2a2ed: Status 404 returned error can't find the container with id 3b738515971d7f6efc0a6c7c6f9f3bc7e0dddaf98b1c97bb1305b1f071c2a2ed Apr 21 01:58:12.265221 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:12.265186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" event={"ID":"66110ac7-61a3-4880-804e-13d9cafd8a53","Type":"ContainerStarted","Data":"3b738515971d7f6efc0a6c7c6f9f3bc7e0dddaf98b1c97bb1305b1f071c2a2ed"} Apr 21 01:58:17.285136 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:17.283428 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" event={"ID":"66110ac7-61a3-4880-804e-13d9cafd8a53","Type":"ContainerStarted","Data":"2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed"} Apr 21 01:58:17.285136 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:17.284163 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:17.301651 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:17.301603 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" podStartSLOduration=0.611835733 podStartE2EDuration="6.301588489s" podCreationTimestamp="2026-04-21 01:58:11 +0000 UTC" firstStartedPulling="2026-04-21 01:58:11.530516327 +0000 UTC m=+530.479607601" lastFinishedPulling="2026-04-21 01:58:17.220269078 +0000 UTC m=+536.169360357" observedRunningTime="2026-04-21 01:58:17.299916637 +0000 UTC m=+536.249007932" watchObservedRunningTime="2026-04-21 01:58:17.301588489 +0000 UTC m=+536.250679785" Apr 21 01:58:29.292021 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:29.291986 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:30.198819 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.198785 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8"] Apr 21 01:58:30.201999 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.201961 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.210766 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.210736 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8"] Apr 21 01:58:30.352930 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.352895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0b85d9b8-b2a4-4ece-aa03-2abce784b396-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.353324 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.352954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrzmg\" (UniqueName: \"kubernetes.io/projected/0b85d9b8-b2a4-4ece-aa03-2abce784b396-kube-api-access-wrzmg\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.454170 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.454095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrzmg\" (UniqueName: \"kubernetes.io/projected/0b85d9b8-b2a4-4ece-aa03-2abce784b396-kube-api-access-wrzmg\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.454170 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.454154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0b85d9b8-b2a4-4ece-aa03-2abce784b396-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.454495 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.454477 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0b85d9b8-b2a4-4ece-aa03-2abce784b396-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.463204 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.463181 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrzmg\" (UniqueName: \"kubernetes.io/projected/0b85d9b8-b2a4-4ece-aa03-2abce784b396-kube-api-access-wrzmg\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.512944 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.512915 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:30.631569 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.631544 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8"] Apr 21 01:58:30.634658 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:58:30.634630 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b85d9b8_b2a4_4ece_aa03_2abce784b396.slice/crio-f3684303dee4b3700dd541d11c06a66199679c43229ea065332af7e37e3064aa WatchSource:0}: Error finding container f3684303dee4b3700dd541d11c06a66199679c43229ea065332af7e37e3064aa: Status 404 returned error can't find the container with id f3684303dee4b3700dd541d11c06a66199679c43229ea065332af7e37e3064aa Apr 21 01:58:30.903538 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.903504 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5"] Apr 21 01:58:30.903778 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.903739 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" containerName="manager" containerID="cri-o://2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed" gracePeriod=2 Apr 21 01:58:30.912014 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.911987 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5"] Apr 21 01:58:30.922453 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.922427 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8"] Apr 21 01:58:30.923720 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.923684 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx"] Apr 21 01:58:30.924023 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.924011 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" containerName="manager" Apr 21 01:58:30.924070 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.924026 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" containerName="manager" Apr 21 01:58:30.924104 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.924082 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" containerName="manager" Apr 21 01:58:30.927066 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.927045 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:30.927185 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.927171 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8"] Apr 21 01:58:30.929208 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.929178 2579 status_manager.go:895] "Failed to get status for pod" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:30.936586 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.936563 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx"] Apr 21 01:58:30.944235 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.944212 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6"] Apr 21 01:58:30.944743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.944722 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" containerName="manager" Apr 21 01:58:30.944880 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.944869 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" containerName="manager" Apr 21 01:58:30.945097 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.945081 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" containerName="manager" Apr 21 01:58:30.947943 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.947913 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:30.953891 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.953861 2579 status_manager.go:895] "Failed to get status for pod" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:30.965645 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:30.965569 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6"] Apr 21 01:58:31.059743 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.059698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q2fwx\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:31.059892 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.059826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f39dfa10-d009-4b2c-9160-2c1ae103d91e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-49fh6\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:31.059892 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.059859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b559m\" (UniqueName: \"kubernetes.io/projected/f39dfa10-d009-4b2c-9160-2c1ae103d91e-kube-api-access-b559m\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-49fh6\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:31.059892 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.059884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrhvf\" (UniqueName: \"kubernetes.io/projected/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-kube-api-access-zrhvf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q2fwx\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:31.117190 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.117165 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:31.119384 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.119359 2579 status_manager.go:895] "Failed to get status for pod" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.160617 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.160556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f39dfa10-d009-4b2c-9160-2c1ae103d91e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-49fh6\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:31.160617 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.160586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b559m\" (UniqueName: \"kubernetes.io/projected/f39dfa10-d009-4b2c-9160-2c1ae103d91e-kube-api-access-b559m\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-49fh6\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:31.160617 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.160608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrhvf\" (UniqueName: \"kubernetes.io/projected/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-kube-api-access-zrhvf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q2fwx\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:31.160798 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.160641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q2fwx\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:31.160959 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.160944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q2fwx\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:31.161031 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.160944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f39dfa10-d009-4b2c-9160-2c1ae103d91e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-49fh6\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:31.168157 
ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.168134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrhvf\" (UniqueName: \"kubernetes.io/projected/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-kube-api-access-zrhvf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q2fwx\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:31.168347 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.168330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b559m\" (UniqueName: \"kubernetes.io/projected/f39dfa10-d009-4b2c-9160-2c1ae103d91e-kube-api-access-b559m\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-49fh6\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:31.261817 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.261792 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66110ac7-61a3-4880-804e-13d9cafd8a53-extensions-socket-volume\") pod \"66110ac7-61a3-4880-804e-13d9cafd8a53\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " Apr 21 01:58:31.261940 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.261824 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pbdw\" (UniqueName: \"kubernetes.io/projected/66110ac7-61a3-4880-804e-13d9cafd8a53-kube-api-access-7pbdw\") pod \"66110ac7-61a3-4880-804e-13d9cafd8a53\" (UID: \"66110ac7-61a3-4880-804e-13d9cafd8a53\") " Apr 21 01:58:31.262161 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.262139 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66110ac7-61a3-4880-804e-13d9cafd8a53-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "66110ac7-61a3-4880-804e-13d9cafd8a53" (UID: "66110ac7-61a3-4880-804e-13d9cafd8a53"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:58:31.264020 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.264002 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66110ac7-61a3-4880-804e-13d9cafd8a53-kube-api-access-7pbdw" (OuterVolumeSpecName: "kube-api-access-7pbdw") pod "66110ac7-61a3-4880-804e-13d9cafd8a53" (UID: "66110ac7-61a3-4880-804e-13d9cafd8a53"). InnerVolumeSpecName "kube-api-access-7pbdw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:58:31.280156 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.280130 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:31.284864 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.284845 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:31.329038 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:58:31.328998 2579 kuberuntime_manager.go:623] "Missing actuated resource record" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" container="manager" Apr 21 01:58:31.330197 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.330156 2579 generic.go:358] "Generic (PLEG): container finished" podID="66110ac7-61a3-4880-804e-13d9cafd8a53" containerID="2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed" exitCode=0 Apr 21 01:58:31.330197 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.330196 2579 scope.go:117] "RemoveContainer" containerID="2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed" Apr 21 01:58:31.330383 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.330226 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" Apr 21 01:58:31.331278 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.331122 2579 status_manager.go:895] "Failed to get status for pod" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.333278 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.333249 2579 status_manager.go:895] "Failed to get status for pod" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.335794 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.335594 2579 status_manager.go:895] "Failed to get status for pod" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.338455 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.338308 2579 status_manager.go:895] "Failed to get status for pod" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.345366 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.345270 2579 scope.go:117] "RemoveContainer" containerID="2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed" Apr 21 01:58:31.350289 ip-10-0-129-42 kubenswrapper[2579]: 
E0421 01:58:31.350158 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed\": container with ID starting with 2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed not found: ID does not exist" containerID="2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed" Apr 21 01:58:31.350289 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.350195 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed"} err="failed to get container status \"2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed\": rpc error: code = NotFound desc = could not find container \"2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed\": container with ID starting with 2f2b7cdb01bb84f5cb9ca8e47eb553eb93f6a09b9532a8c1bd02a5eaf2327fed not found: ID does not exist" Apr 21 01:58:31.357327 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.357305 2579 status_manager.go:895] "Failed to get status for pod" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.359492 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.359446 2579 status_manager.go:895] "Failed to get status for pod" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.362697 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.362648 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/66110ac7-61a3-4880-804e-13d9cafd8a53-extensions-socket-volume\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:31.362697 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.362668 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7pbdw\" (UniqueName: \"kubernetes.io/projected/66110ac7-61a3-4880-804e-13d9cafd8a53-kube-api-access-7pbdw\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:31.427755 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.427727 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx"] Apr 21 01:58:31.430537 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:58:31.430514 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0fd8f95_4795_435a_98e5_b2a8ab1b3f3e.slice/crio-fea131e0b614c894fcbb5692e0776c3d64aa2e80735f3aff66645445299e8420 WatchSource:0}: Error finding container fea131e0b614c894fcbb5692e0776c3d64aa2e80735f3aff66645445299e8420: Status 404 returned error can't find the container with id 
fea131e0b614c894fcbb5692e0776c3d64aa2e80735f3aff66645445299e8420 Apr 21 01:58:31.442806 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.442781 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6"] Apr 21 01:58:31.448480 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:58:31.448454 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf39dfa10_d009_4b2c_9160_2c1ae103d91e.slice/crio-a97981c43c27b6661d3117cda494e0b4a9a9343371566a5cd0ac903ee935ba82 WatchSource:0}: Error finding container a97981c43c27b6661d3117cda494e0b4a9a9343371566a5cd0ac903ee935ba82: Status 404 returned error can't find the container with id a97981c43c27b6661d3117cda494e0b4a9a9343371566a5cd0ac903ee935ba82 Apr 21 01:58:31.709872 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.709789 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" path="/var/lib/kubelet/pods/66110ac7-61a3-4880-804e-13d9cafd8a53/volumes" Apr 21 01:58:31.710431 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.710380 2579 status_manager.go:895] "Failed to get status for pod" podUID="66110ac7-61a3-4880-804e-13d9cafd8a53" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-v7nt5" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-v7nt5\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:31.712591 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:31.712565 2579 status_manager.go:895] "Failed to get status for pod" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:32.340322 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.340283 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" event={"ID":"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e","Type":"ContainerStarted","Data":"bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe"} Apr 21 01:58:32.340322 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.340328 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" event={"ID":"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e","Type":"ContainerStarted","Data":"fea131e0b614c894fcbb5692e0776c3d64aa2e80735f3aff66645445299e8420"} Apr 21 01:58:32.340581 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.340392 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:32.341674 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.341644 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" containerName="manager" 
containerID="cri-o://7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3" gracePeriod=2 Apr 21 01:58:32.343919 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.343882 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" event={"ID":"f39dfa10-d009-4b2c-9160-2c1ae103d91e","Type":"ContainerStarted","Data":"01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f"} Apr 21 01:58:32.343919 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.343915 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" event={"ID":"f39dfa10-d009-4b2c-9160-2c1ae103d91e","Type":"ContainerStarted","Data":"a97981c43c27b6661d3117cda494e0b4a9a9343371566a5cd0ac903ee935ba82"} Apr 21 01:58:32.344076 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.344019 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:32.361551 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.361513 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" podStartSLOduration=2.361499233 podStartE2EDuration="2.361499233s" podCreationTimestamp="2026-04-21 01:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:58:32.359540682 +0000 UTC m=+551.308631983" watchObservedRunningTime="2026-04-21 01:58:32.361499233 +0000 UTC m=+551.310590585" Apr 21 01:58:32.381783 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.381745 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" podStartSLOduration=2.3817276769999998 podStartE2EDuration="2.381727677s" podCreationTimestamp="2026-04-21 01:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:58:32.378919342 +0000 UTC m=+551.328010638" watchObservedRunningTime="2026-04-21 01:58:32.381727677 +0000 UTC m=+551.330818974" Apr 21 01:58:32.572936 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.572913 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:32.575211 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.575177 2579 status_manager.go:895] "Failed to get status for pod" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:32.670794 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.670714 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0b85d9b8-b2a4-4ece-aa03-2abce784b396-extensions-socket-volume\") pod \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " Apr 21 01:58:32.670794 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.670775 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrzmg\" (UniqueName: \"kubernetes.io/projected/0b85d9b8-b2a4-4ece-aa03-2abce784b396-kube-api-access-wrzmg\") pod \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\" (UID: \"0b85d9b8-b2a4-4ece-aa03-2abce784b396\") " Apr 21 01:58:32.671105 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.671080 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b85d9b8-b2a4-4ece-aa03-2abce784b396-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0b85d9b8-b2a4-4ece-aa03-2abce784b396" (UID: "0b85d9b8-b2a4-4ece-aa03-2abce784b396"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:58:32.672986 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.672950 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b85d9b8-b2a4-4ece-aa03-2abce784b396-kube-api-access-wrzmg" (OuterVolumeSpecName: "kube-api-access-wrzmg") pod "0b85d9b8-b2a4-4ece-aa03-2abce784b396" (UID: "0b85d9b8-b2a4-4ece-aa03-2abce784b396"). InnerVolumeSpecName "kube-api-access-wrzmg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:58:32.771997 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.771942 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wrzmg\" (UniqueName: \"kubernetes.io/projected/0b85d9b8-b2a4-4ece-aa03-2abce784b396-kube-api-access-wrzmg\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:32.771997 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:32.771995 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0b85d9b8-b2a4-4ece-aa03-2abce784b396-extensions-socket-volume\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:33.348096 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.348066 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" containerID="7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3" exitCode=0 Apr 21 01:58:33.348270 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.348116 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" Apr 21 01:58:33.348270 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.348169 2579 scope.go:117] "RemoveContainer" containerID="7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3" Apr 21 01:58:33.350417 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.350389 2579 status_manager.go:895] "Failed to get status for pod" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:33.356335 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.356320 2579 scope.go:117] "RemoveContainer" containerID="7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3" Apr 21 01:58:33.356550 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:58:33.356534 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3\": container with ID starting with 7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3 not found: ID does not exist" containerID="7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3" Apr 21 01:58:33.356595 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.356558 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3"} err="failed to get container status \"7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3\": rpc error: code = NotFound desc = could not find container \"7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3\": container with ID starting with 7aa5e007c7e69f6160c233d244d28dcc91dd7f5114ac34ca1c8672b767f81ff3 not found: ID does not exist" Apr 21 01:58:33.358133 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.358109 2579 status_manager.go:895] "Failed to get status for pod" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wljx8" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wljx8\" is forbidden: User \"system:node:ip-10-0-129-42.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-42.ec2.internal' and this object" Apr 21 01:58:33.710034 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:33.709948 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b85d9b8-b2a4-4ece-aa03-2abce784b396" path="/var/lib/kubelet/pods/0b85d9b8-b2a4-4ece-aa03-2abce784b396/volumes" Apr 21 01:58:43.350649 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.350615 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:43.351051 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.350664 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:43.414426 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.414387 2579 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx"] Apr 21 01:58:43.414626 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.414605 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" podUID="f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" containerName="manager" containerID="cri-o://bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe" gracePeriod=10 Apr 21 01:58:43.657879 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.657858 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:43.702951 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.702912 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk"] Apr 21 01:58:43.703218 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.703205 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" containerName="manager" Apr 21 01:58:43.703218 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.703220 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" containerName="manager" Apr 21 01:58:43.703290 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.703276 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" containerName="manager" Apr 21 01:58:43.706121 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.706096 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:43.726189 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.726163 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk"] Apr 21 01:58:43.756146 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.756114 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-extensions-socket-volume\") pod \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " Apr 21 01:58:43.756332 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.756176 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrhvf\" (UniqueName: \"kubernetes.io/projected/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-kube-api-access-zrhvf\") pod \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\" (UID: \"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e\") " Apr 21 01:58:43.758825 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.756877 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" (UID: "f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:58:43.762523 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.762493 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-kube-api-access-zrhvf" (OuterVolumeSpecName: "kube-api-access-zrhvf") pod "f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" (UID: "f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e"). InnerVolumeSpecName "kube-api-access-zrhvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:58:43.856734 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.856686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a69bba8-8791-4d69-bb06-5f4d7e7c178e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rdkkk\" (UID: \"9a69bba8-8791-4d69-bb06-5f4d7e7c178e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:43.856933 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.856773 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trwrq\" (UniqueName: \"kubernetes.io/projected/9a69bba8-8791-4d69-bb06-5f4d7e7c178e-kube-api-access-trwrq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rdkkk\" (UID: \"9a69bba8-8791-4d69-bb06-5f4d7e7c178e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:43.856933 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.856842 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-extensions-socket-volume\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:43.856933 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.856858 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrhvf\" (UniqueName: \"kubernetes.io/projected/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e-kube-api-access-zrhvf\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:43.957302 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.957214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trwrq\" (UniqueName: \"kubernetes.io/projected/9a69bba8-8791-4d69-bb06-5f4d7e7c178e-kube-api-access-trwrq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rdkkk\" (UID: \"9a69bba8-8791-4d69-bb06-5f4d7e7c178e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:43.957302 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.957290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a69bba8-8791-4d69-bb06-5f4d7e7c178e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rdkkk\" (UID: \"9a69bba8-8791-4d69-bb06-5f4d7e7c178e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:43.957657 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.957640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a69bba8-8791-4d69-bb06-5f4d7e7c178e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rdkkk\" (UID: \"9a69bba8-8791-4d69-bb06-5f4d7e7c178e\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:43.966794 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:43.966765 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trwrq\" (UniqueName: \"kubernetes.io/projected/9a69bba8-8791-4d69-bb06-5f4d7e7c178e-kube-api-access-trwrq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rdkkk\" (UID: \"9a69bba8-8791-4d69-bb06-5f4d7e7c178e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:44.017663 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.017612 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:44.143932 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.143905 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk"] Apr 21 01:58:44.145655 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:58:44.145622 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a69bba8_8791_4d69_bb06_5f4d7e7c178e.slice/crio-0d9b1d7e5c695f4a2dce59d602a1ccced4f40584e27e520d5aafe8a1982fb8a5 WatchSource:0}: Error finding container 0d9b1d7e5c695f4a2dce59d602a1ccced4f40584e27e520d5aafe8a1982fb8a5: Status 404 returned error can't find the container with id 0d9b1d7e5c695f4a2dce59d602a1ccced4f40584e27e520d5aafe8a1982fb8a5 Apr 21 01:58:44.386004 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.385954 2579 generic.go:358] "Generic (PLEG): container finished" podID="f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" containerID="bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe" exitCode=0 Apr 21 01:58:44.386457 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.386045 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" event={"ID":"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e","Type":"ContainerDied","Data":"bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe"} Apr 21 01:58:44.386457 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.386072 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" Apr 21 01:58:44.386457 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.386090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx" event={"ID":"f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e","Type":"ContainerDied","Data":"fea131e0b614c894fcbb5692e0776c3d64aa2e80735f3aff66645445299e8420"} Apr 21 01:58:44.386457 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.386110 2579 scope.go:117] "RemoveContainer" containerID="bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe" Apr 21 01:58:44.387622 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.387598 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" event={"ID":"9a69bba8-8791-4d69-bb06-5f4d7e7c178e","Type":"ContainerStarted","Data":"334c1cb211d94937bd27da77ce7e7907f8b9674bdc1a72834f1ae9d92827138e"} Apr 21 01:58:44.387716 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.387633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" event={"ID":"9a69bba8-8791-4d69-bb06-5f4d7e7c178e","Type":"ContainerStarted","Data":"0d9b1d7e5c695f4a2dce59d602a1ccced4f40584e27e520d5aafe8a1982fb8a5"} Apr 21 01:58:44.387716 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.387668 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:44.394701 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:58:44.394673 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0fd8f95_4795_435a_98e5_b2a8ab1b3f3e.slice\": RecentStats: unable to find data in memory cache]" Apr 21 01:58:44.394848 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:58:44.394819 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0fd8f95_4795_435a_98e5_b2a8ab1b3f3e.slice\": RecentStats: unable to find data in memory cache]" Apr 21 01:58:44.395564 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.395543 2579 scope.go:117] "RemoveContainer" containerID="bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe" Apr 21 01:58:44.395848 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:58:44.395825 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe\": container with ID starting with bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe not found: ID does not exist" containerID="bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe" Apr 21 01:58:44.395893 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.395862 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe"} err="failed to get container status \"bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe\": rpc error: code = NotFound desc = could not find container \"bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe\": container with ID starting with 
bd0afdeb1f21e20bd8c190f31c0f1562142e5ceb0a7464175a3dcb8853984bfe not found: ID does not exist" Apr 21 01:58:44.416715 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.416674 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" podStartSLOduration=1.416657129 podStartE2EDuration="1.416657129s" podCreationTimestamp="2026-04-21 01:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:58:44.416009807 +0000 UTC m=+563.365101108" watchObservedRunningTime="2026-04-21 01:58:44.416657129 +0000 UTC m=+563.365748426" Apr 21 01:58:44.433809 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.433781 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx"] Apr 21 01:58:44.446140 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:44.446118 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q2fwx"] Apr 21 01:58:45.709998 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:45.709953 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e" path="/var/lib/kubelet/pods/f0fd8f95-4795-435a-98e5-b2a8ab1b3f3e/volumes" Apr 21 01:58:55.395017 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.394960 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rdkkk" Apr 21 01:58:55.444453 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.444421 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6"] Apr 21 01:58:55.444673 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.444639 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" podUID="f39dfa10-d009-4b2c-9160-2c1ae103d91e" containerName="manager" containerID="cri-o://01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f" gracePeriod=10 Apr 21 01:58:55.687132 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.687110 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:55.849054 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.849020 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f39dfa10-d009-4b2c-9160-2c1ae103d91e-extensions-socket-volume\") pod \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " Apr 21 01:58:55.849228 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.849067 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b559m\" (UniqueName: \"kubernetes.io/projected/f39dfa10-d009-4b2c-9160-2c1ae103d91e-kube-api-access-b559m\") pod \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\" (UID: \"f39dfa10-d009-4b2c-9160-2c1ae103d91e\") " Apr 21 01:58:55.849483 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.849452 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f39dfa10-d009-4b2c-9160-2c1ae103d91e-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f39dfa10-d009-4b2c-9160-2c1ae103d91e" (UID: "f39dfa10-d009-4b2c-9160-2c1ae103d91e"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:58:55.851415 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.851388 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39dfa10-d009-4b2c-9160-2c1ae103d91e-kube-api-access-b559m" (OuterVolumeSpecName: "kube-api-access-b559m") pod "f39dfa10-d009-4b2c-9160-2c1ae103d91e" (UID: "f39dfa10-d009-4b2c-9160-2c1ae103d91e"). InnerVolumeSpecName "kube-api-access-b559m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:58:55.949840 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.949769 2579 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f39dfa10-d009-4b2c-9160-2c1ae103d91e-extensions-socket-volume\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:55.949840 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:55.949794 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b559m\" (UniqueName: \"kubernetes.io/projected/f39dfa10-d009-4b2c-9160-2c1ae103d91e-kube-api-access-b559m\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:58:56.434710 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.434675 2579 generic.go:358] "Generic (PLEG): container finished" podID="f39dfa10-d009-4b2c-9160-2c1ae103d91e" containerID="01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f" exitCode=0 Apr 21 01:58:56.435196 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.434730 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" Apr 21 01:58:56.435196 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.434759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" event={"ID":"f39dfa10-d009-4b2c-9160-2c1ae103d91e","Type":"ContainerDied","Data":"01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f"} Apr 21 01:58:56.435196 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.434799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6" event={"ID":"f39dfa10-d009-4b2c-9160-2c1ae103d91e","Type":"ContainerDied","Data":"a97981c43c27b6661d3117cda494e0b4a9a9343371566a5cd0ac903ee935ba82"} Apr 21 01:58:56.435196 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.434816 2579 scope.go:117] "RemoveContainer" containerID="01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f" Apr 21 01:58:56.443483 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.443467 2579 scope.go:117] "RemoveContainer" containerID="01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f" Apr 21 01:58:56.443705 ip-10-0-129-42 kubenswrapper[2579]: E0421 01:58:56.443690 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f\": container with ID starting with 01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f not found: ID does not exist" containerID="01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f" Apr 21 01:58:56.443747 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.443713 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f"} err="failed to get container status \"01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f\": rpc error: code = NotFound desc = could not find container \"01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f\": container with ID starting with 01032790fd891771c1667d222f5900a3f21d1ea6dfab18ea4a460eee5488606f not found: ID does not exist" Apr 21 01:58:56.458313 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.458288 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6"] Apr 21 01:58:56.461654 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:56.461630 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-49fh6"] Apr 21 01:58:57.709986 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:57.709938 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39dfa10-d009-4b2c-9160-2c1ae103d91e" path="/var/lib/kubelet/pods/f39dfa10-d009-4b2c-9160-2c1ae103d91e/volumes" Apr 21 01:58:59.816673 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.816645 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj"] Apr 21 01:58:59.817103 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.816932 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f39dfa10-d009-4b2c-9160-2c1ae103d91e" containerName="manager" Apr 21 01:58:59.817103 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.816945 2579 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f39dfa10-d009-4b2c-9160-2c1ae103d91e" containerName="manager" Apr 21 01:58:59.817103 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.817029 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f39dfa10-d009-4b2c-9160-2c1ae103d91e" containerName="manager" Apr 21 01:58:59.821189 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.821171 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.823634 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.823617 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-k6t95\"" Apr 21 01:58:59.830829 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.830808 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj"] Apr 21 01:58:59.975844 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.975803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.975844 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.975843 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87hm\" (UniqueName: \"kubernetes.io/projected/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-kube-api-access-m87hm\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.976084 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.975866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.976084 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.975920 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.976084 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.975964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.976084 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.976023 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.976084 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.976072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.976241 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.976093 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:58:59.976241 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:58:59.976154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077168 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077077 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077168 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077168 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077458 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077263 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: 
\"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077458 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077458 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m87hm\" (UniqueName: \"kubernetes.io/projected/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-kube-api-access-m87hm\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077594 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077660 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077755 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077795 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077829 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077795 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.077829 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.077824 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.078131 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.078103 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.078364 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.078341 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.079689 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.079665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.080275 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.080253 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.084597 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.084569 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m87hm\" (UniqueName: \"kubernetes.io/projected/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-kube-api-access-m87hm\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.084597 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.084585 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f4a79fb3-19e4-4c44-9c2f-a8a08c87744a-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-5m5lj\" (UID: \"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.133529 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.133490 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:00.269144 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.269118 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj"] Apr 21 01:59:00.271739 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:59:00.271708 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a79fb3_19e4_4c44_9c2f_a8a08c87744a.slice/crio-1abc0a14189963596edefbbe64d2cd9e841f87aae1e5772543c7fd4948fb9d19 WatchSource:0}: Error finding container 1abc0a14189963596edefbbe64d2cd9e841f87aae1e5772543c7fd4948fb9d19: Status 404 returned error can't find the container with id 1abc0a14189963596edefbbe64d2cd9e841f87aae1e5772543c7fd4948fb9d19 Apr 21 01:59:00.449546 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:00.449463 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" event={"ID":"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a","Type":"ContainerStarted","Data":"1abc0a14189963596edefbbe64d2cd9e841f87aae1e5772543c7fd4948fb9d19"} Apr 21 01:59:02.812343 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:02.812305 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 01:59:02.812655 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:02.812378 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 01:59:02.812655 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:02.812408 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 01:59:03.461439 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:03.461404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" event={"ID":"f4a79fb3-19e4-4c44-9c2f-a8a08c87744a","Type":"ContainerStarted","Data":"a343e676ea1ed3a5b06d1c95af411c46f85125cda40f773319c0a7951ebc0a6f"} Apr 21 01:59:03.480497 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:03.480441 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" podStartSLOduration=1.942047361 podStartE2EDuration="4.480427889s" podCreationTimestamp="2026-04-21 01:58:59 +0000 UTC" firstStartedPulling="2026-04-21 01:59:00.273694009 +0000 UTC m=+579.222785281" lastFinishedPulling="2026-04-21 01:59:02.812074535 +0000 UTC m=+581.761165809" observedRunningTime="2026-04-21 01:59:03.478861369 +0000 UTC m=+582.427952663" watchObservedRunningTime="2026-04-21 01:59:03.480427889 +0000 UTC m=+582.429519183" Apr 21 01:59:04.134150 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:04.134119 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:04.138598 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:04.138573 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 
01:59:04.465018 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:04.464919 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:04.465879 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:04.465861 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-5m5lj" Apr 21 01:59:14.460843 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.460802 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8cp4c"] Apr 21 01:59:14.464008 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.463986 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.466408 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.466388 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 01:59:14.466489 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.466411 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bmpqw\"" Apr 21 01:59:14.470652 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.470630 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8cp4c"] Apr 21 01:59:14.477921 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.477889 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8cqj\" (UniqueName: \"kubernetes.io/projected/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-kube-api-access-j8cqj\") pod \"limitador-limitador-7d549b5b-8cp4c\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.478048 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.477989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-config-file\") pod \"limitador-limitador-7d549b5b-8cp4c\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.562703 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.562670 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8cp4c"] Apr 21 01:59:14.578553 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.578521 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8cqj\" (UniqueName: \"kubernetes.io/projected/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-kube-api-access-j8cqj\") pod \"limitador-limitador-7d549b5b-8cp4c\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.578724 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.578568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-config-file\") pod \"limitador-limitador-7d549b5b-8cp4c\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.579125 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.579107 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-config-file\") pod \"limitador-limitador-7d549b5b-8cp4c\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.586337 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.586314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8cqj\" (UniqueName: \"kubernetes.io/projected/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-kube-api-access-j8cqj\") pod \"limitador-limitador-7d549b5b-8cp4c\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.775683 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.775655 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:14.898303 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:14.898275 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8cp4c"] Apr 21 01:59:14.900736 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:59:14.900704 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f375d33_0f83_4443_b6c7_4aacfdbf64b4.slice/crio-85452fc765d20075d7377e6aee8f4fbe0a38440c594742452f3b890d31d483e3 WatchSource:0}: Error finding container 85452fc765d20075d7377e6aee8f4fbe0a38440c594742452f3b890d31d483e3: Status 404 returned error can't find the container with id 85452fc765d20075d7377e6aee8f4fbe0a38440c594742452f3b890d31d483e3 Apr 21 01:59:15.502327 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:15.502284 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" event={"ID":"3f375d33-0f83-4443-b6c7-4aacfdbf64b4","Type":"ContainerStarted","Data":"85452fc765d20075d7377e6aee8f4fbe0a38440c594742452f3b890d31d483e3"} Apr 21 01:59:18.514923 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:18.514891 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" event={"ID":"3f375d33-0f83-4443-b6c7-4aacfdbf64b4","Type":"ContainerStarted","Data":"83916e75c366004a225118b6a14decf268dcd23bbd46cab071ae75f222c340cb"} Apr 21 01:59:18.515350 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:18.514966 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:18.531888 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:18.531841 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" podStartSLOduration=1.9102248720000001 podStartE2EDuration="4.531826992s" podCreationTimestamp="2026-04-21 01:59:14 +0000 UTC" firstStartedPulling="2026-04-21 01:59:14.902663367 +0000 UTC m=+593.851754641" lastFinishedPulling="2026-04-21 01:59:17.524265486 +0000 UTC m=+596.473356761" observedRunningTime="2026-04-21 01:59:18.529697413 +0000 UTC m=+597.478788707" watchObservedRunningTime="2026-04-21 01:59:18.531826992 +0000 UTC m=+597.480918288" Apr 21 01:59:21.583908 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:21.583882 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log" Apr 21 01:59:21.584381 ip-10-0-129-42 
kubenswrapper[2579]: I0421 01:59:21.584023 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log" Apr 21 01:59:29.519113 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:29.519085 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:31.232799 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:31.232756 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8cp4c"] Apr 21 01:59:31.233336 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:31.233037 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" podUID="3f375d33-0f83-4443-b6c7-4aacfdbf64b4" containerName="limitador" containerID="cri-o://83916e75c366004a225118b6a14decf268dcd23bbd46cab071ae75f222c340cb" gracePeriod=30 Apr 21 01:59:31.553426 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:31.553388 2579 generic.go:358] "Generic (PLEG): container finished" podID="3f375d33-0f83-4443-b6c7-4aacfdbf64b4" containerID="83916e75c366004a225118b6a14decf268dcd23bbd46cab071ae75f222c340cb" exitCode=0 Apr 21 01:59:31.553546 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:31.553461 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" event={"ID":"3f375d33-0f83-4443-b6c7-4aacfdbf64b4","Type":"ContainerDied","Data":"83916e75c366004a225118b6a14decf268dcd23bbd46cab071ae75f222c340cb"} Apr 21 01:59:32.176816 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.176792 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:32.221749 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.221723 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-config-file\") pod \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " Apr 21 01:59:32.221885 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.221763 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8cqj\" (UniqueName: \"kubernetes.io/projected/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-kube-api-access-j8cqj\") pod \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\" (UID: \"3f375d33-0f83-4443-b6c7-4aacfdbf64b4\") " Apr 21 01:59:32.222124 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.222102 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-config-file" (OuterVolumeSpecName: "config-file") pod "3f375d33-0f83-4443-b6c7-4aacfdbf64b4" (UID: "3f375d33-0f83-4443-b6c7-4aacfdbf64b4"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:59:32.223933 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.223910 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-kube-api-access-j8cqj" (OuterVolumeSpecName: "kube-api-access-j8cqj") pod "3f375d33-0f83-4443-b6c7-4aacfdbf64b4" (UID: "3f375d33-0f83-4443-b6c7-4aacfdbf64b4"). InnerVolumeSpecName "kube-api-access-j8cqj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:59:32.322556 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.322475 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8cqj\" (UniqueName: \"kubernetes.io/projected/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-kube-api-access-j8cqj\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:59:32.322556 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.322507 2579 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3f375d33-0f83-4443-b6c7-4aacfdbf64b4-config-file\") on node \"ip-10-0-129-42.ec2.internal\" DevicePath \"\"" Apr 21 01:59:32.558766 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.558729 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" event={"ID":"3f375d33-0f83-4443-b6c7-4aacfdbf64b4","Type":"ContainerDied","Data":"85452fc765d20075d7377e6aee8f4fbe0a38440c594742452f3b890d31d483e3"} Apr 21 01:59:32.558766 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.558766 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8cp4c" Apr 21 01:59:32.559033 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.558778 2579 scope.go:117] "RemoveContainer" containerID="83916e75c366004a225118b6a14decf268dcd23bbd46cab071ae75f222c340cb" Apr 21 01:59:32.579098 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.579041 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8cp4c"] Apr 21 01:59:32.582915 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:32.582894 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8cp4c"] Apr 21 01:59:33.709742 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:33.709706 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f375d33-0f83-4443-b6c7-4aacfdbf64b4" path="/var/lib/kubelet/pods/3f375d33-0f83-4443-b6c7-4aacfdbf64b4/volumes" Apr 21 01:59:35.253313 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.253284 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-t86t6"] Apr 21 01:59:35.253797 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.253629 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f375d33-0f83-4443-b6c7-4aacfdbf64b4" containerName="limitador" Apr 21 01:59:35.253797 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.253644 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f375d33-0f83-4443-b6c7-4aacfdbf64b4" containerName="limitador" Apr 21 01:59:35.253797 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.253705 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f375d33-0f83-4443-b6c7-4aacfdbf64b4" containerName="limitador" Apr 21 01:59:35.257991 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.257958 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.260380 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.260351 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 21 01:59:35.260594 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.260359 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-v2gdd\"" Apr 21 01:59:35.263162 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.263119 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-t86t6"] Apr 21 01:59:35.346911 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.346886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4d211181-6bca-4b56-9c46-1597b20ad1a6-data\") pod \"postgres-868db5846d-t86t6\" (UID: \"4d211181-6bca-4b56-9c46-1597b20ad1a6\") " pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.347114 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.346949 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcj9w\" (UniqueName: \"kubernetes.io/projected/4d211181-6bca-4b56-9c46-1597b20ad1a6-kube-api-access-zcj9w\") pod \"postgres-868db5846d-t86t6\" (UID: \"4d211181-6bca-4b56-9c46-1597b20ad1a6\") " pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.448233 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.448201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4d211181-6bca-4b56-9c46-1597b20ad1a6-data\") pod \"postgres-868db5846d-t86t6\" (UID: \"4d211181-6bca-4b56-9c46-1597b20ad1a6\") " pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.448410 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.448257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcj9w\" (UniqueName: \"kubernetes.io/projected/4d211181-6bca-4b56-9c46-1597b20ad1a6-kube-api-access-zcj9w\") pod \"postgres-868db5846d-t86t6\" (UID: \"4d211181-6bca-4b56-9c46-1597b20ad1a6\") " pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.448589 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.448567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4d211181-6bca-4b56-9c46-1597b20ad1a6-data\") pod \"postgres-868db5846d-t86t6\" (UID: \"4d211181-6bca-4b56-9c46-1597b20ad1a6\") " pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.455854 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.455820 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcj9w\" (UniqueName: \"kubernetes.io/projected/4d211181-6bca-4b56-9c46-1597b20ad1a6-kube-api-access-zcj9w\") pod \"postgres-868db5846d-t86t6\" (UID: \"4d211181-6bca-4b56-9c46-1597b20ad1a6\") " pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.572906 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.572808 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:35.693040 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:35.693013 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-t86t6"] Apr 21 01:59:35.695584 ip-10-0-129-42 kubenswrapper[2579]: W0421 01:59:35.695553 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d211181_6bca_4b56_9c46_1597b20ad1a6.slice/crio-e2c804b87ed95b022d16e83e8217ebf86c51f132209708f85977205fe9bc0a52 WatchSource:0}: Error finding container e2c804b87ed95b022d16e83e8217ebf86c51f132209708f85977205fe9bc0a52: Status 404 returned error can't find the container with id e2c804b87ed95b022d16e83e8217ebf86c51f132209708f85977205fe9bc0a52 Apr 21 01:59:36.578719 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:36.578685 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-t86t6" event={"ID":"4d211181-6bca-4b56-9c46-1597b20ad1a6","Type":"ContainerStarted","Data":"e2c804b87ed95b022d16e83e8217ebf86c51f132209708f85977205fe9bc0a52"} Apr 21 01:59:44.607326 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:44.607291 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-t86t6" event={"ID":"4d211181-6bca-4b56-9c46-1597b20ad1a6","Type":"ContainerStarted","Data":"2e2cf6eda237308551b73110f3ef6eceba97b3646cef8ea9e0effde17545bcf7"} Apr 21 01:59:44.607691 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:44.607392 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 01:59:44.622537 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:44.622443 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-t86t6" podStartSLOduration=0.959064671 podStartE2EDuration="9.622429004s" podCreationTimestamp="2026-04-21 01:59:35 +0000 UTC" firstStartedPulling="2026-04-21 01:59:35.69683021 +0000 UTC m=+614.645921487" lastFinishedPulling="2026-04-21 01:59:44.360194544 +0000 UTC m=+623.309285820" observedRunningTime="2026-04-21 01:59:44.621124336 +0000 UTC m=+623.570215630" watchObservedRunningTime="2026-04-21 01:59:44.622429004 +0000 UTC m=+623.571520298" Apr 21 01:59:50.639348 ip-10-0-129-42 kubenswrapper[2579]: I0421 01:59:50.639314 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-t86t6" Apr 21 02:00:01.716037 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.716001 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-k9pbh"] Apr 21 02:00:01.719328 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.719309 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" Apr 21 02:00:01.721934 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.721912 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 21 02:00:01.723066 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.723046 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-f94xv\"" Apr 21 02:00:01.723168 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.723069 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 21 02:00:01.726281 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.726244 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-k9pbh"] Apr 21 02:00:01.876667 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.876619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fs5q\" (UniqueName: \"kubernetes.io/projected/19ca1381-9114-4fe6-907f-86d95ad13dc6-kube-api-access-8fs5q\") pod \"keycloak-operator-5c4df598dd-k9pbh\" (UID: \"19ca1381-9114-4fe6-907f-86d95ad13dc6\") " pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" Apr 21 02:00:01.977967 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.977932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fs5q\" (UniqueName: \"kubernetes.io/projected/19ca1381-9114-4fe6-907f-86d95ad13dc6-kube-api-access-8fs5q\") pod \"keycloak-operator-5c4df598dd-k9pbh\" (UID: \"19ca1381-9114-4fe6-907f-86d95ad13dc6\") " pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" Apr 21 02:00:01.991287 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:01.991252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fs5q\" (UniqueName: \"kubernetes.io/projected/19ca1381-9114-4fe6-907f-86d95ad13dc6-kube-api-access-8fs5q\") pod \"keycloak-operator-5c4df598dd-k9pbh\" (UID: \"19ca1381-9114-4fe6-907f-86d95ad13dc6\") " pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" Apr 21 02:00:02.032353 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:02.032313 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" Apr 21 02:00:02.173044 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:02.173009 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-k9pbh"] Apr 21 02:00:02.176368 ip-10-0-129-42 kubenswrapper[2579]: W0421 02:00:02.176341 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19ca1381_9114_4fe6_907f_86d95ad13dc6.slice/crio-942374b381253f9bde2450849211fc3eb9bfc9eb875bb39a24bea028585de497 WatchSource:0}: Error finding container 942374b381253f9bde2450849211fc3eb9bfc9eb875bb39a24bea028585de497: Status 404 returned error can't find the container with id 942374b381253f9bde2450849211fc3eb9bfc9eb875bb39a24bea028585de497 Apr 21 02:00:02.671899 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:02.671860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" event={"ID":"19ca1381-9114-4fe6-907f-86d95ad13dc6","Type":"ContainerStarted","Data":"942374b381253f9bde2450849211fc3eb9bfc9eb875bb39a24bea028585de497"} Apr 21 02:00:08.692509 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:08.692466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" event={"ID":"19ca1381-9114-4fe6-907f-86d95ad13dc6","Type":"ContainerStarted","Data":"6b765171d75f9586ae6dd1ee3faf76929e74d82dfee5a57ab3425104197c5a57"} Apr 21 02:00:08.708843 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:00:08.708792 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-k9pbh" podStartSLOduration=2.091807302 podStartE2EDuration="7.708778886s" podCreationTimestamp="2026-04-21 02:00:01 +0000 UTC" firstStartedPulling="2026-04-21 02:00:02.17765014 +0000 UTC m=+641.126741414" lastFinishedPulling="2026-04-21 02:00:07.794621722 +0000 UTC m=+646.743712998" observedRunningTime="2026-04-21 02:00:08.706701883 +0000 UTC m=+647.655793178" watchObservedRunningTime="2026-04-21 02:00:08.708778886 +0000 UTC m=+647.657870224" Apr 21 02:04:10.299166 ip-10-0-129-42 kubenswrapper[2579]: E0421 02:04:10.299144 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/systemd-tmpfiles-clean.service\": RecentStats: unable to find data in memory cache]" Apr 21 02:04:21.604990 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:21.604944 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log" Apr 21 02:04:21.605536 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:21.605518 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log" Apr 21 02:04:28.073938 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:28.073901 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-xswb2_88c3287f-4a89-491a-9e89-2727be40c662/manager/0.log" Apr 21 02:04:28.429014 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:28.428929 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-rlcxz_64d57abe-4892-46cb-bd51-25c5e8213a5b/manager/1.log" Apr 21 02:04:28.649009 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:28.648958 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-64bbc69db5-92n2w_6e215592-89d0-4593-9c7d-edec627230a9/manager/0.log" Apr 21 02:04:28.870334 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:28.870291 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-t86t6_4d211181-6bca-4b56-9c46-1597b20ad1a6/postgres/0.log" Apr 21 02:04:30.709478 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:30.709445 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-rdkkk_9a69bba8-8791-4d69-bb06-5f4d7e7c178e/manager/0.log" Apr 21 02:04:31.383533 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:31.383501 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-b9mwz_15159b7f-80c1-423d-9922-0973f6795205/discovery/0.log" Apr 21 02:04:31.595760 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:31.595728 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56c874cd68-rzt28_fda4a570-8f0b-47fa-8cdd-5dfbc383110d/kube-auth-proxy/0.log" Apr 21 02:04:31.712535 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:31.712460 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-5m5lj_f4a79fb3-19e4-4c44-9c2f-a8a08c87744a/istio-proxy/0.log" Apr 21 02:04:39.346574 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:39.346541 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9rpdw_ccbf8a05-33ed-4417-b280-79fda195d53d/global-pull-secret-syncer/0.log" Apr 21 02:04:39.430100 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:39.430069 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jvwm9_e0901c2f-3720-4876-9776-5d298a25b8a3/konnectivity-agent/0.log" Apr 21 02:04:39.498813 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:39.498778 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-42.ec2.internal_90d1b61eadce6e4cc6b47dcca75f1db0/haproxy/0.log" Apr 21 02:04:43.664818 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:43.664776 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-rdkkk_9a69bba8-8791-4d69-bb06-5f4d7e7c178e/manager/0.log" Apr 21 02:04:45.429203 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:45.429170 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vfbfv_0087979a-093e-49eb-8643-f323f304cd76/node-exporter/0.log" Apr 21 02:04:45.453281 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:45.453258 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vfbfv_0087979a-093e-49eb-8643-f323f304cd76/kube-rbac-proxy/0.log" Apr 21 02:04:45.471232 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:45.471209 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vfbfv_0087979a-093e-49eb-8643-f323f304cd76/init-textfile/0.log" Apr 21 02:04:48.044101 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.044057 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s"] Apr 21 02:04:48.047310 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.047289 2579 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.049654 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.049632 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5xmbv\"/\"default-dockercfg-whz96\"" Apr 21 02:04:48.050743 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.050717 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5xmbv\"/\"openshift-service-ca.crt\"" Apr 21 02:04:48.050814 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.050777 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5xmbv\"/\"kube-root-ca.crt\"" Apr 21 02:04:48.056013 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.055966 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s"] Apr 21 02:04:48.127741 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.127701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-podres\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.127900 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.127794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-sys\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.127900 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.127850 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcvn\" (UniqueName: \"kubernetes.io/projected/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-kube-api-access-fgcvn\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.127900 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.127876 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-lib-modules\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.128061 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.127908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-proc\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229354 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcvn\" (UniqueName: \"kubernetes.io/projected/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-kube-api-access-fgcvn\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") 
" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229354 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-lib-modules\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229603 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-proc\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229603 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229463 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-proc\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229603 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229527 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-lib-modules\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229603 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-podres\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229603 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-sys\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229803 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-podres\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.229803 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.229648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-sys\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.237501 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.237471 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcvn\" 
(UniqueName: \"kubernetes.io/projected/dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071-kube-api-access-fgcvn\") pod \"perf-node-gather-daemonset-zsb9s\" (UID: \"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.357398 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.357318 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:48.478847 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.478807 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s"] Apr 21 02:04:48.485543 ip-10-0-129-42 kubenswrapper[2579]: W0421 02:04:48.485514 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddec1be4b_bc7d_49dd_aeaf_9b4d50c1d071.slice/crio-3c5f9873028badca7873fd04442d7a0e69a8def00a821975926b6821d1672569 WatchSource:0}: Error finding container 3c5f9873028badca7873fd04442d7a0e69a8def00a821975926b6821d1672569: Status 404 returned error can't find the container with id 3c5f9873028badca7873fd04442d7a0e69a8def00a821975926b6821d1672569 Apr 21 02:04:48.487115 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.487089 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:04:48.591223 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:48.591190 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" event={"ID":"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071","Type":"ContainerStarted","Data":"3c5f9873028badca7873fd04442d7a0e69a8def00a821975926b6821d1672569"} Apr 21 02:04:49.575640 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:49.575608 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hwqnv_1bdc049e-4435-4a8a-a927-c21e9eb190a6/dns/0.log" Apr 21 02:04:49.595841 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:49.595803 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" event={"ID":"dec1be4b-bc7d-49dd-aeaf-9b4d50c1d071","Type":"ContainerStarted","Data":"0424ad86d36c09becf987c63fa54877ebe6e4a7422984b776d58a750d416a8f2"} Apr 21 02:04:49.596045 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:49.596032 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:04:49.601523 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:49.601504 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hwqnv_1bdc049e-4435-4a8a-a927-c21e9eb190a6/kube-rbac-proxy/0.log" Apr 21 02:04:49.610919 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:49.610882 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" podStartSLOduration=1.6108669899999999 podStartE2EDuration="1.61086699s" podCreationTimestamp="2026-04-21 02:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:04:49.610479165 +0000 UTC m=+928.559570459" watchObservedRunningTime="2026-04-21 02:04:49.61086699 +0000 UTC m=+928.559958356" Apr 21 02:04:49.667551 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:49.667524 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-ndnpw_a45eec20-9193-41a2-a65b-20a0623f23d5/dns-node-resolver/0.log" Apr 21 02:04:50.285248 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:50.285220 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wv5c7_69b70ba3-d144-4882-86ae-cb9a2e4391a4/node-ca/0.log" Apr 21 02:04:51.108325 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:51.108293 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-b9mwz_15159b7f-80c1-423d-9922-0973f6795205/discovery/0.log" Apr 21 02:04:51.148253 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:51.148222 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56c874cd68-rzt28_fda4a570-8f0b-47fa-8cdd-5dfbc383110d/kube-auth-proxy/0.log" Apr 21 02:04:51.176963 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:51.176933 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-5m5lj_f4a79fb3-19e4-4c44-9c2f-a8a08c87744a/istio-proxy/0.log" Apr 21 02:04:51.697755 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:51.697723 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r2p22_70ae3e7d-6300-4ae1-b1f2-a24f5fff0bfc/serve-healthcheck-canary/0.log" Apr 21 02:04:52.197700 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:52.197672 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j5d6m_413a0815-b025-4937-bb22-c070bfb35757/kube-rbac-proxy/0.log" Apr 21 02:04:52.216444 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:52.216420 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j5d6m_413a0815-b025-4937-bb22-c070bfb35757/exporter/0.log" Apr 21 02:04:52.234613 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:52.234583 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j5d6m_413a0815-b025-4937-bb22-c070bfb35757/extractor/0.log" Apr 21 02:04:54.157845 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:54.157794 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-xswb2_88c3287f-4a89-491a-9e89-2727be40c662/manager/0.log" Apr 21 02:04:54.236242 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:54.236211 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-rlcxz_64d57abe-4892-46cb-bd51-25c5e8213a5b/manager/0.log" Apr 21 02:04:54.247167 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:54.247141 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-rlcxz_64d57abe-4892-46cb-bd51-25c5e8213a5b/manager/1.log" Apr 21 02:04:54.303491 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:54.303459 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-64bbc69db5-92n2w_6e215592-89d0-4593-9c7d-edec627230a9/manager/0.log" Apr 21 02:04:54.367956 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:54.367929 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-t86t6_4d211181-6bca-4b56-9c46-1597b20ad1a6/postgres/0.log" Apr 21 02:04:55.610251 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:04:55.610216 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-zsb9s" Apr 21 02:05:01.308743 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.308716 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljkm7_16bf952c-6936-44eb-a262-16dac395d351/kube-multus-additional-cni-plugins/0.log" Apr 21 02:05:01.348415 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.348387 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljkm7_16bf952c-6936-44eb-a262-16dac395d351/egress-router-binary-copy/0.log" Apr 21 02:05:01.366604 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.366581 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljkm7_16bf952c-6936-44eb-a262-16dac395d351/cni-plugins/0.log" Apr 21 02:05:01.385026 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.385000 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljkm7_16bf952c-6936-44eb-a262-16dac395d351/bond-cni-plugin/0.log" Apr 21 02:05:01.406794 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.406770 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljkm7_16bf952c-6936-44eb-a262-16dac395d351/routeoverride-cni/0.log" Apr 21 02:05:01.425348 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.425324 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljkm7_16bf952c-6936-44eb-a262-16dac395d351/whereabouts-cni-bincopy/0.log" Apr 21 02:05:01.446175 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.446145 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ljkm7_16bf952c-6936-44eb-a262-16dac395d351/whereabouts-cni/0.log" Apr 21 02:05:01.607711 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.607630 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cxcst_18adaff4-c987-4e36-8131-4023c56e79c0/kube-multus/0.log" Apr 21 02:05:01.679604 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.679560 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lwrh2_202b33ea-296c-464b-8ee5-774d048859c2/network-metrics-daemon/0.log" Apr 21 02:05:01.696840 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:01.696811 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lwrh2_202b33ea-296c-464b-8ee5-774d048859c2/kube-rbac-proxy/0.log" Apr 21 02:05:02.783129 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.783097 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-controller/0.log" Apr 21 02:05:02.799569 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.799541 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/0.log" Apr 21 02:05:02.803505 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.803485 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovn-acl-logging/1.log" Apr 21 02:05:02.820357 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.820318 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/kube-rbac-proxy-node/0.log" Apr 21 02:05:02.838462 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.838438 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 02:05:02.856201 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.856175 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/northd/0.log" Apr 21 02:05:02.874192 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.874170 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/nbdb/0.log" Apr 21 02:05:02.892505 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.892489 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/sbdb/0.log" Apr 21 02:05:02.983484 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:02.983457 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d79xd_4f79c179-ef2b-42fc-ae2f-cbf567b17a05/ovnkube-controller/0.log" Apr 21 02:05:04.324437 ip-10-0-129-42 kubenswrapper[2579]: I0421 02:05:04.324411 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tn9c4_a1f54627-ed53-4b78-85cb-f07dd7bc3869/network-check-target-container/0.log"