Apr 22 19:55:21.293263 ip-10-0-135-72 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:55:21.293274 ip-10-0-135-72 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:55:21.293281 ip-10-0-135-72 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:55:21.293518 ip-10-0-135-72 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:55:31.457520 ip-10-0-135-72 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:55:31.457535 ip-10-0-135-72 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot cb1e061c79da4d8daae46ff71ca504f2 --
Apr 22 19:57:45.352101 ip-10-0-135-72 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:57:45.775110 ip-10-0-135-72 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:45.775110 ip-10-0-135-72 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:57:45.775110 ip-10-0-135-72 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:45.775110 ip-10-0-135-72 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:57:45.775110 ip-10-0-135-72 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:45.777989 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.777897 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:57:45.780254 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780238 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:45.780254 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780254 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780258 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780261 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780264 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780266 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780269 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780273 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780276 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780278 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780281 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780285 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780287 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780290 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780293 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780296 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780299 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780301 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780304 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780306 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780309 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:45.780317 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780311 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780314 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780318 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780320 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780323 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780326 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780329 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780332 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780334 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780337 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780340 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780343 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780345 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780348 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780350 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780353 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780356 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780358 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780361 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780363 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:45.780793 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780366 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780368 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780371 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780373 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780375 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780378 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780380 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780384 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780387 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780389 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780391 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780394 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780397 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780399 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780403 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780405 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780409 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780411 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780413 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780416 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:45.781308 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780418 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780421 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780423 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780426 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780428 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780431 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780436 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780440 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780444 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780447 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780451 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780455 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780458 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780461 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780464 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780466 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780469 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780472 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780474 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:45.781787 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780477 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780480 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780482 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780485 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780487 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780490 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780905 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780912 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780915 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780917 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780920 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780922 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780925 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780927 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780930 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780932 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780935 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780937 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780940 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780943 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780945 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:45.782250 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780947 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780950 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780953 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780955 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780957 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780960 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780962 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780964 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780967 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780969 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780972 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780975 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780978 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780980 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780983 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780985 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780988 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780990 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780993 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:45.782761 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.780997 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781001 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781003 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781006 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781009 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781011 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781016 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781019 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781022 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781025 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781028 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781031 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781033 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781036 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781039 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781041 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781044 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781047 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781049 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781052 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:45.783340 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781054 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781057 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781059 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781063 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781065 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781068 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781071 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781074 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781077 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781079 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781082 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781084 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781088 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781091 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781093 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781095 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781098 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781101 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781103 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:45.783852 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781105 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781108 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781110 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781113 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781115 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781117 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781120 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781122 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781125 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781127 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781129 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781132 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.781134 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.781991 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782000 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782007 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782012 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782017 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782020 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782025 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782030 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:57:45.784314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782034 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782037 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782041 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782044 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782047 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782050 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782053 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782056 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782059 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782062 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782065 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782069 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782072 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782075 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782077 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782081 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782085 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782088 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782091 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782094 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782097 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782100 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782103 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782106 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782109 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:57:45.784846 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782114 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782117 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782120 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782123 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782126 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782129 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782134 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782138 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782141 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782144 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782147 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782151 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782153 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782157 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782160 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782162 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782165 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782168 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782171 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782174 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782176 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782179 2578 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782183 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782186 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 
19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782189 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:57:45.785458 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782192 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782195 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782198 2578 flags.go:64] FLAG: --help="false" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782201 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-135-72.ec2.internal" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782204 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782207 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782210 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782213 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782217 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782219 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782222 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782225 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782228 2578 flags.go:64] 
FLAG: --kube-api-burst="100" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782230 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782233 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782237 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782239 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782242 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782245 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782248 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782251 2578 flags.go:64] FLAG: --lock-file="" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782254 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782257 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782260 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:57:45.786071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782265 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782269 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782272 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 
19:57:45.782274 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782277 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782281 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782284 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782287 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782291 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782294 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782298 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782300 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782303 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782306 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782309 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782312 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782315 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782318 2578 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782326 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782329 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782332 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782335 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782338 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:57:45.786674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782343 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782346 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782350 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782353 2578 flags.go:64] FLAG: --port="10250" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782356 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782359 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b9fe7bb9c10bfb51" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782362 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782365 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782368 2578 
flags.go:64] FLAG: --register-node="true" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782371 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782377 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782381 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782383 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782386 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782389 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782393 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782396 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782399 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782402 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782405 2578 flags.go:64] FLAG: --runonce="false" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782408 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782411 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782415 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782418 2578 
flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782421 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782423 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:57:45.787251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782427 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782430 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782433 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782436 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782438 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782441 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782444 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782447 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782454 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782460 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782463 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782465 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 
19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782469 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782472 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782475 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782477 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782482 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782485 2578 flags.go:64] FLAG: --v="2" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782490 2578 flags.go:64] FLAG: --version="false" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782494 2578 flags.go:64] FLAG: --vmodule="" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782498 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.782501 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782597 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782602 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:45.787919 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782605 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782609 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782611 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782614 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782617 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782620 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782622 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782625 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782627 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782630 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782632 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782635 2578 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782638 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782640 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782643 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782645 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782648 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782651 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782653 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782656 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:45.788495 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782658 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782660 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782663 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782665 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 
19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782669 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782672 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782674 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782677 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782680 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782682 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782684 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782690 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782693 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782695 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782698 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782708 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782712 2578 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782715 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782718 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:45.789010 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782720 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782724 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782728 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782730 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782734 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782737 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782740 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782742 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782745 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782747 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:45.789478 
ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782750 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782753 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782755 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782758 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782760 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782763 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782766 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782770 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782773 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782776 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:45.789478 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782778 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782781 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782783 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles 
Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782786 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782790 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782792 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782795 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782797 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782800 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782815 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782818 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782820 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782823 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782826 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782828 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782831 2578 feature_gate.go:328] 
unrecognized feature gate: DualReplica
Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782833 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782835 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782838 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782840 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782843 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:45.791069 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782846 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:45.791069 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782848 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:45.791069 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782851 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:45.791069 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782854 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:45.791069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.783592 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:45.792702 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.792580 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:57:45.792736 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.792705 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:57:45.792769 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792757 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:45.792769 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792763 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:45.792769 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792767 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792771 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792774 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792777 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792779 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792782 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792784 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792788 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792790 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792793 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792796 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792798 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792801 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792817 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792821 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792827 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792830 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792834 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792836 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792840 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:45.792903 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792842 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792845 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792848 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792850 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792853 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792856 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792858 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792861 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792864 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792866 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792868 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792871 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792874 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792878 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792881 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792884 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792886 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792889 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792891 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792894 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:45.793422 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792896 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792898 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792901 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792904 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792906 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792909 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792911 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792915 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792918 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792921 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792923 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792926 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792928 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792931 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792933 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792936 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792938 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792941 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792943 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792946 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792949 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792951 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792954 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792956 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792959 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792962 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792965 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792968 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792970 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792973 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792977 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792981 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792984 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792987 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792990 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792993 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792995 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792997 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793000 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793002 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793005 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793007 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793010 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793012 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.793017 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793118 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793124 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793127 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793130 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793133 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793136 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793139 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793141 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793143 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:45.794905 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793146 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793149 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793152 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793155 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793158 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793160 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793163 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793165 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793168 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793170 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793173 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793175 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793178 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793181 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793184 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793187 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793189 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793192 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793194 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793197 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:45.795280 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793199 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793202 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793204 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793207 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793209 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793211 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793214 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793216 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793219 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793221 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793224 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793226 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793229 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793231 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793233 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793237 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793239 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793241 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793244 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793246 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:45.795753 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793249 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793252 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793254 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793257 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793259 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793262 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793265 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793267 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793270 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793272 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793275 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793277 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793280 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793282 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793284 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793287 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793290 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793294 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793297 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:45.796263 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793300 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793303 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793306 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793308 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793311 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793313 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793316 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793318 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793321 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793323 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793326 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793328 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793331 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793334 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793336 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793338 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793341 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:45.796747 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.793343 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:45.797318 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.793348 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:45.797318 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.794213 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:57:45.799294 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.799280 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:57:45.800205 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.800193 2578 server.go:1019] "Starting client certificate rotation"
Apr 22 19:57:45.800322 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.800304 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:57:45.800382 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.800359 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:57:45.828611 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.828589 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:57:45.830564 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.830544 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:57:45.843945 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.843924 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:57:45.851761 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.851743 2578 log.go:25] "Validated CRI v1 image API"
Apr 22 19:57:45.853179 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.853161 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:57:45.859143 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.859097 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b226bb04-b962-4d94-b616-a4e2a1860886:/dev/nvme0n1p3 d44e299a-e143-4474-86ee-3e94d1131eeb:/dev/nvme0n1p4]
Apr 22 19:57:45.859236 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.859143 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:57:45.861473 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.861450 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:57:45.865698 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.865584 2578 manager.go:217] Machine: {Timestamp:2026-04-22 19:57:45.863288987 +0000 UTC m=+0.394010987 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096794 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22923844e8b277073115331c07f98e SystemUUID:ec229238-44e8-b277-0731-15331c07f98e BootID:cb1e061c-79da-4d8d-aae4-6ff71ca504f2 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5d:38:33:b6:2b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5d:38:33:b6:2b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:cd:2f:b5:69:2e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:57:45.866439 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.866427 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
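Aside for anyone triaging an excerpt like the one above: the kubelet enumerates its feature-gate map several times during startup, so the same `unrecognized feature gate` warning appears repeatedly with different klog timestamps. Deduplicating the gate names makes the noise auditable. A minimal sketch (the regex and tally logic are illustrative, not part of this log; the sample lines are copied from it):

```python
import re
from collections import Counter

# Matches the gate name at the end of a klog warning line, e.g.
# "... feature_gate.go:328] unrecognized feature gate: DualReplica"
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def tally_unrecognized_gates(journal_text: str) -> Counter:
    """Count how often each unrecognized feature gate is reported."""
    return Counter(m.group(1) for m in GATE_RE.finditer(journal_text))

sample = """\
Apr 22 19:57:45.790206 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.782833 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.793928 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792926 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.794394 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:45.792949 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
"""

counts = tally_unrecognized_gates(sample)
print(counts.most_common())  # → [('MultiDiskSetup', 2), ('DualReplica', 1)]
```

Gates that appear in every enumeration are configured but unknown to this kubelet build; they are warnings, not the cause of the earlier `Failed with result 'resources'` start failure.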
Apr 22 19:57:45.866586 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.866573 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:57:45.867824 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.867786 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:57:45.867967 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.867827 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-72.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:57:45.868016 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.867977 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:57:45.868016 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.867986 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:57:45.868016 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.868000 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:45.868906 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.868895 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:45.870830 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.870820 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:45.870940 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.870931 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:57:45.874548 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.874537 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:57:45.874581 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.874552 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:57:45.874581 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.874569 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:57:45.874581 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.874578 2578 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:57:45.874684 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.874587 2578 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 19:57:45.875901 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.875887 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:45.875940 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.875915 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:45.879538 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.879520 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:57:45.881442 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.881428 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:57:45.883669 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883636 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:57:45.883717 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883678 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:57:45.883717 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883690 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:57:45.883717 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883699 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:57:45.883717 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883713 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:57:45.883843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883721 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:57:45.883843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883731 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
19:57:45.883843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883739 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:57:45.883843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883750 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:57:45.883843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883759 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:57:45.883843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883772 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:57:45.883843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.883784 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:57:45.884912 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.884901 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:57:45.884943 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.884915 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:57:45.885725 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.885690 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-72.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:57:45.885791 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.885729 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:57:45.888686 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.888673 2578 watchdog_linux.go:99] 
"Systemd watchdog is not enabled" Apr 22 19:57:45.888762 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.888708 2578 server.go:1295] "Started kubelet" Apr 22 19:57:45.888823 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.888784 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:57:45.888864 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.888793 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:57:45.888911 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.888879 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:57:45.889790 ip-10-0-135-72 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:57:45.890503 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.890488 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:57:45.890569 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.890493 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:57:45.896483 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.896468 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-72.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:57:45.897268 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.896263 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-72.ec2.internal.18a8c61983ff3dd1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-72.ec2.internal,UID:ip-10-0-135-72.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-72.ec2.internal,},FirstTimestamp:2026-04-22 19:57:45.888685521 +0000 UTC m=+0.419407520,LastTimestamp:2026-04-22 19:57:45.888685521 +0000 UTC m=+0.419407520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-72.ec2.internal,}" Apr 22 19:57:45.900072 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.899925 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:45.900072 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.899986 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:57:45.900585 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.900555 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:57:45.901669 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.901634 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:57:45.903129 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.902960 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:57:45.903129 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.902986 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:57:45.903129 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.903029 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:45.903129 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903068 2578 factory.go:221] Registration of the containerd container 
factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:57:45.903129 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903082 2578 factory.go:55] Registering systemd factory Apr 22 19:57:45.903129 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903094 2578 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:57:45.903377 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903137 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:57:45.903377 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903146 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:57:45.903607 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903500 2578 factory.go:153] Registering CRI-O factory Apr 22 19:57:45.903607 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903517 2578 factory.go:223] Registration of the crio container factory successfully Apr 22 19:57:45.903607 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903535 2578 factory.go:103] Registering Raw factory Apr 22 19:57:45.903607 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903545 2578 manager.go:1196] Started watching for new ooms in manager Apr 22 19:57:45.903955 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.903908 2578 manager.go:319] Starting recovery of all containers Apr 22 19:57:45.905039 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.905011 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:57:45.905119 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.905035 2578 controller.go:145] "Failed to ensure lease exists, will 
retry" err="leases.coordination.k8s.io \"ip-10-0-135-72.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:57:45.910677 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.910651 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nfwk7" Apr 22 19:57:45.914006 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.913979 2578 manager.go:324] Recovery completed Apr 22 19:57:45.917801 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.917782 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nfwk7" Apr 22 19:57:45.918069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.918057 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:45.920760 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.920746 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:45.920826 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.920775 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:45.920826 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.920786 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:45.922032 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.922018 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:57:45.922032 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.922033 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:57:45.922124 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.922048 2578 
state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:45.922662 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.922601 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-72.ec2.internal.18a8c61985e8af27 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-72.ec2.internal,UID:ip-10-0-135-72.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-72.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-72.ec2.internal,},FirstTimestamp:2026-04-22 19:57:45.920761639 +0000 UTC m=+0.451483639,LastTimestamp:2026-04-22 19:57:45.920761639 +0000 UTC m=+0.451483639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-72.ec2.internal,}" Apr 22 19:57:45.924492 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.924481 2578 policy_none.go:49] "None policy: Start" Apr 22 19:57:45.924527 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.924497 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:57:45.924527 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.924509 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:57:45.962481 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.962461 2578 manager.go:341] "Starting Device Plugin manager" Apr 22 19:57:45.962600 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.962528 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:57:45.962600 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.962542 2578 server.go:85] "Starting device 
plugin registration server" Apr 22 19:57:45.963044 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.962844 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:57:45.963044 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.962858 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:57:45.963044 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.962941 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:57:45.963044 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.963019 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:57:45.963044 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:45.963030 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:57:45.963926 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.963908 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:57:45.964001 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:45.963942 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.040152 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.040078 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:57:46.041465 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.041445 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 19:57:46.041465 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.041474 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:57:46.041641 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.041493 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:57:46.041641 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.041501 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:57:46.041641 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.041534 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:57:46.047012 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.046992 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:46.063761 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.063739 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:46.064637 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.064619 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:46.064735 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.064655 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:46.064735 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.064671 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:46.064735 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.064700 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.071212 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.071196 2578 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.071305 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.071220 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-72.ec2.internal\": node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.087640 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.087614 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.142301 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.142264 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal"] Apr 22 19:57:46.142433 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.142349 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:46.144124 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.144107 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:46.144202 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.144136 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:46.144202 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.144146 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:46.145314 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.145302 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:46.145468 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.145453 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.145506 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.145483 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:46.146088 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.146051 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:46.146161 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.146092 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:46.146161 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.146102 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:46.146161 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.146070 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:46.146297 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.146166 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:46.146297 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.146184 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:46.147323 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.147308 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.147402 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.147337 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:46.148321 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.148307 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:46.148388 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.148327 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:46.148388 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.148337 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:46.167440 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.167418 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-72.ec2.internal\" not found" node="ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.172085 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.172068 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-72.ec2.internal\" not found" node="ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.188057 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.188032 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.205431 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.205404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bedb542f9001b846bfe76e857d486e03-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal\" (UID: \"bedb542f9001b846bfe76e857d486e03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.205538 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.205436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bedb542f9001b846bfe76e857d486e03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal\" (UID: \"bedb542f9001b846bfe76e857d486e03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.205538 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.205461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/761d5be20dc0ed38f4bc469fd088da76-config\") pod \"kube-apiserver-proxy-ip-10-0-135-72.ec2.internal\" (UID: \"761d5be20dc0ed38f4bc469fd088da76\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.288959 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.288925 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.306408 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.306350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bedb542f9001b846bfe76e857d486e03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal\" (UID: \"bedb542f9001b846bfe76e857d486e03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.306408 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.306380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bedb542f9001b846bfe76e857d486e03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal\" (UID: \"bedb542f9001b846bfe76e857d486e03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.306508 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.306416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/761d5be20dc0ed38f4bc469fd088da76-config\") pod \"kube-apiserver-proxy-ip-10-0-135-72.ec2.internal\" (UID: \"761d5be20dc0ed38f4bc469fd088da76\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.306508 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.306475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bedb542f9001b846bfe76e857d486e03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal\" (UID: \"bedb542f9001b846bfe76e857d486e03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.306508 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.306485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bedb542f9001b846bfe76e857d486e03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal\" (UID: \"bedb542f9001b846bfe76e857d486e03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.306508 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.306498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/761d5be20dc0ed38f4bc469fd088da76-config\") pod \"kube-apiserver-proxy-ip-10-0-135-72.ec2.internal\" (UID: \"761d5be20dc0ed38f4bc469fd088da76\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.389838 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.389786 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.469411 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.469373 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.474999 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.474977 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" Apr 22 19:57:46.490764 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.490741 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.591354 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.591314 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.692000 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.691962 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.771978 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.771942 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:46.793117 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.793082 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.800256 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.800233 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 
22 19:57:46.800387 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.800371 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:57:46.800442 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.800405 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:57:46.869053 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.868983 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:46.893204 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.893160 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:46.900560 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.900538 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:46.909236 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.909214 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:57:46.920964 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.920920 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:52:45 +0000 UTC" deadline="2027-11-15 20:45:51.728345613 +0000 UTC" Apr 22 19:57:46.921066 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.920965 2578 certificate_manager.go:431] "Waiting for 
next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13728h48m4.807384692s" Apr 22 19:57:46.967606 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.967430 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n8nd7" Apr 22 19:57:46.974286 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:46.974250 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod761d5be20dc0ed38f4bc469fd088da76.slice/crio-c620719ea7d9cc1abdfe5feab71b3bb21f3472192066abea84b14c222f72d312 WatchSource:0}: Error finding container c620719ea7d9cc1abdfe5feab71b3bb21f3472192066abea84b14c222f72d312: Status 404 returned error can't find the container with id c620719ea7d9cc1abdfe5feab71b3bb21f3472192066abea84b14c222f72d312 Apr 22 19:57:46.974674 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:46.974648 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbedb542f9001b846bfe76e857d486e03.slice/crio-b940a0ac5472e22585b349db3e992650acf9aed50c06d728756d0158c894607a WatchSource:0}: Error finding container b940a0ac5472e22585b349db3e992650acf9aed50c06d728756d0158c894607a: Status 404 returned error can't find the container with id b940a0ac5472e22585b349db3e992650acf9aed50c06d728756d0158c894607a Apr 22 19:57:46.976039 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.976018 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n8nd7" Apr 22 19:57:46.978800 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:46.978785 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:57:46.993290 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:46.993274 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:47.044852 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.044793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" event={"ID":"bedb542f9001b846bfe76e857d486e03","Type":"ContainerStarted","Data":"b940a0ac5472e22585b349db3e992650acf9aed50c06d728756d0158c894607a"} Apr 22 19:57:47.045673 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.045655 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" event={"ID":"761d5be20dc0ed38f4bc469fd088da76","Type":"ContainerStarted","Data":"c620719ea7d9cc1abdfe5feab71b3bb21f3472192066abea84b14c222f72d312"} Apr 22 19:57:47.093866 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:47.093847 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-72.ec2.internal\" not found" Apr 22 19:57:47.137545 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.137494 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:47.202117 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.202100 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" Apr 22 19:57:47.214899 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.214880 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:47.215888 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.215877 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" Apr 22 19:57:47.224564 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.224552 2578 warnings.go:110] "Warning: metadata.name: 
this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:47.875542 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.875513 2578 apiserver.go:52] "Watching apiserver" Apr 22 19:57:47.885058 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.885034 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:57:47.885759 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.885731 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-m22tm","openshift-multus/network-metrics-daemon-xbxhx","openshift-network-diagnostics/network-check-target-rktp2","kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal","openshift-cluster-node-tuning-operator/tuned-5x5fq","openshift-network-operator/iptables-alerter-rd9fc","openshift-ovn-kubernetes/ovnkube-node-8ncpz","kube-system/konnectivity-agent-r64fz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg","openshift-image-registry/node-ca-pjssp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal","openshift-multus/multus-additional-cni-plugins-zh622"] Apr 22 19:57:47.889244 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.889222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.890173 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.890147 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:47.890275 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:47.890221 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:57:47.891368 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.891343 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:47.891481 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:47.891458 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:57:47.891636 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.891608 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mflnj\"" Apr 22 19:57:47.891711 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.891664 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:57:47.891711 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.891675 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:57:47.891832 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.891618 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:57:47.892019 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.892001 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:57:47.892377 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.892352 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.892447 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.892418 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rd9fc" Apr 22 19:57:47.894107 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894087 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:57:47.894445 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894422 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:57:47.894572 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894556 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:57:47.894641 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894559 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8kl9w\"" Apr 22 19:57:47.894894 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894875 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.894980 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894897 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:57:47.894980 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894874 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:57:47.894980 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894932 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7bzm\"" Apr 22 19:57:47.894980 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.894948 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:57:47.896437 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.896423 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.897668 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.897562 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:57:47.897668 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.897570 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:57:47.897668 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.897603 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:57:47.897668 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.897632 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:57:47.897950 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.897849 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:57:47.898262 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898073 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:57:47.898262 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pjssp" Apr 22 19:57:47.898369 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898276 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:57:47.898369 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898297 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:57:47.898869 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898570 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-4pxgt\"" Apr 22 19:57:47.898869 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898608 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:57:47.898869 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898639 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-f8gcl\"" Apr 22 19:57:47.898869 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898663 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:57:47.898869 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.898727 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jmxb4\"" Apr 22 19:57:47.899113 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.899045 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:57:47.899730 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.899709 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.900526 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.900509 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:57:47.900743 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.900728 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:57:47.900829 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.900784 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:57:47.900890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.900826 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qqmvh\"" Apr 22 19:57:47.901754 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.901732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cx7zt\"" Apr 22 19:57:47.901851 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.901787 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:57:47.901932 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.901916 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:57:47.904117 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.904102 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:57:47.916437 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916410 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-conf-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.916552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-slash\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.916552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-etc-selinux\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.916552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-lib-modules\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.916552 ip-10-0-135-72 kubenswrapper[2578]: I0422 
19:57:47.916524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-system-cni-dir\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.916552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-etc-kubernetes\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cni-binary-copy\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/16b4200b-7937-4b41-acdc-2d428d40a524-agent-certs\") pod \"konnectivity-agent-r64fz\" (UID: \"16b4200b-7937-4b41-acdc-2d428d40a524\") " pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx24c\" (UniqueName: \"kubernetes.io/projected/a27547a1-f78d-4d2d-8d4f-0816fae61920-kube-api-access-zx24c\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: 
\"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916682 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-modprobe-d\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916710 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-cnibin\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-hostroot\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-ovn\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-systemd\") 
pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-sys-fs\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-kubelet\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916868 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-run-netns\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8snk\" (UniqueName: \"kubernetes.io/projected/bebc2174-0145-4f91-b0a3-c497f508c693-kube-api-access-n8snk\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.916925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916914 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-kubernetes\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c893932-7c81-4353-821d-dd67be4edf70-tmp\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.916960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cnibin\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-cni-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.917521 
ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-host-slash\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917119 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917144 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-var-lib-kubelet\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-host\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjr6g\" (UniqueName: \"kubernetes.io/projected/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-kube-api-access-gjr6g\") pod 
\"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtqwl\" (UniqueName: \"kubernetes.io/projected/802bd93c-03cf-435c-a223-487ff037f6c7-kube-api-access-qtqwl\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dkfg\" (UniqueName: \"kubernetes.io/projected/4c893932-7c81-4353-821d-dd67be4edf70-kube-api-access-4dkfg\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbg2\" (UniqueName: \"kubernetes.io/projected/8501acc2-dabe-4f52-9b02-ba92e386acb7-kube-api-access-khbg2\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917290 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-socket-dir-parent\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-netns\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.917521 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-os-release\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-cni-multus\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " 
pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-systemd-units\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-device-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-os-release\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917510 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-daemon-config\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-log-socket\") pod 
\"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-cni-bin\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-ovnkube-config\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917599 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-run\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-system-cni-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: 
\"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-iptables-alerter-script\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-systemd\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-node-log\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918263 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-ovnkube-script-lib\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: 
I0422 19:57:47.917769 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysctl-d\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtw2\" (UniqueName: \"kubernetes.io/projected/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-kube-api-access-tgtw2\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917897 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-env-overrides\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917924 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bebc2174-0145-4f91-b0a3-c497f508c693-ovn-node-metrics-cert\") pod \"ovnkube-node-8ncpz\" (UID: 
\"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.917959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8501acc2-dabe-4f52-9b02-ba92e386acb7-host\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-cni-binary-copy\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-var-lib-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-etc-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/16b4200b-7937-4b41-acdc-2d428d40a524-konnectivity-ca\") pod \"konnectivity-agent-r64fz\" (UID: \"16b4200b-7937-4b41-acdc-2d428d40a524\") " pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-registration-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb29t\" (UniqueName: \"kubernetes.io/projected/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-kube-api-access-sb29t\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-k8s-cni-cncf-io\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.918890 
ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918225 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8501acc2-dabe-4f52-9b02-ba92e386acb7-serviceca\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp" Apr 22 19:57:47.918890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-multus-certs\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-cni-netd\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-socket-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: 
\"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysconfig\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c893932-7c81-4353-821d-dd67be4edf70-etc-tuned\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918438 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-cni-bin\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-kubelet\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918523 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysctl-conf\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.919397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.918583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-sys\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:47.977373 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.977325 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:46 +0000 UTC" deadline="2027-12-26 20:25:33.957565186 +0000 UTC" Apr 22 19:57:47.977373 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:47.977367 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14712h27m45.980201278s" Apr 22 19:57:48.019234 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/16b4200b-7937-4b41-acdc-2d428d40a524-konnectivity-ca\") pod \"konnectivity-agent-r64fz\" (UID: \"16b4200b-7937-4b41-acdc-2d428d40a524\") " pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:57:48.019405 
ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-registration-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.019405 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb29t\" (UniqueName: \"kubernetes.io/projected/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-kube-api-access-sb29t\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.019405 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-k8s-cni-cncf-io\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019405 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8501acc2-dabe-4f52-9b02-ba92e386acb7-serviceca\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp" Apr 22 19:57:48.019405 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-registration-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.019405 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-multus-certs\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019405 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-multus-certs\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019405 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019400 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-cni-netd\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-k8s-cni-cncf-io\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-socket-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-cni-netd\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysconfig\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c893932-7c81-4353-821d-dd67be4edf70-etc-tuned\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019535 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-cni-bin\") pod \"multus-m22tm\" (UID: 
\"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-socket-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysconfig\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-kubelet\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-cni-bin\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.019621 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019597 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-kubelet\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysctl-conf\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-sys\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 
19:57:48.019855 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019704 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-conf-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-slash\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-etc-selinux\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-lib-modules\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-system-cni-dir\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " 
pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-conf-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8501acc2-dabe-4f52-9b02-ba92e386acb7-serviceca\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-etc-kubernetes\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-slash\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019852 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.019889 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs podName:802bd93c-03cf-435c-a223-487ff037f6c7 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:48.519866605 +0000 UTC m=+3.050588597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs") pod "network-metrics-daemon-xbxhx" (UID: "802bd93c-03cf-435c-a223-487ff037f6c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-system-cni-dir\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/16b4200b-7937-4b41-acdc-2d428d40a524-konnectivity-ca\") pod \"konnectivity-agent-r64fz\" (UID: \"16b4200b-7937-4b41-acdc-2d428d40a524\") " pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cni-binary-copy\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") 
" pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-sys\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-etc-kubernetes\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-lib-modules\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.020704 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-etc-selinux\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/16b4200b-7937-4b41-acdc-2d428d40a524-agent-certs\") pod \"konnectivity-agent-r64fz\" (UID: \"16b4200b-7937-4b41-acdc-2d428d40a524\") " 
pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019982 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx24c\" (UniqueName: \"kubernetes.io/projected/a27547a1-f78d-4d2d-8d4f-0816fae61920-kube-api-access-zx24c\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-modprobe-d\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.019799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysctl-conf\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-cnibin\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-cnibin\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " 
pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-modprobe-d\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-hostroot\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-ovn\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-systemd\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-hostroot\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.021740 ip-10-0-135-72 
kubenswrapper[2578]: I0422 19:57:48.020261 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-sys-fs\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-ovn\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-systemd\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-kubelet\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-run-netns\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.021740 ip-10-0-135-72 
kubenswrapper[2578]: I0422 19:57:48.020339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-sys-fs\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.021740 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8snk\" (UniqueName: \"kubernetes.io/projected/bebc2174-0145-4f91-b0a3-c497f508c693-kube-api-access-n8snk\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020370 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-kubernetes\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-run-netns\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c893932-7c81-4353-821d-dd67be4edf70-tmp\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.022554 ip-10-0-135-72 
kubenswrapper[2578]: I0422 19:57:48.020418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cnibin\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020424 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cni-binary-copy\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-kubernetes\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-cni-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " 
pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-host-slash\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cnibin\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-var-lib-kubelet\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-cni-dir\") pod \"multus-m22tm\" (UID: 
\"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-host\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjr6g\" (UniqueName: \"kubernetes.io/projected/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-kube-api-access-gjr6g\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.022554 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtqwl\" (UniqueName: \"kubernetes.io/projected/802bd93c-03cf-435c-a223-487ff037f6c7-kube-api-access-qtqwl\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-kubelet\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021515 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dkfg\" (UniqueName: \"kubernetes.io/projected/4c893932-7c81-4353-821d-dd67be4edf70-kube-api-access-4dkfg\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khbg2\" (UniqueName: \"kubernetes.io/projected/8501acc2-dabe-4f52-9b02-ba92e386acb7-kube-api-access-khbg2\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-socket-dir-parent\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021622 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-netns\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-os-release\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-cni-multus\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-systemd-units\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-device-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021831 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-os-release\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-daemon-config\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-log-socket\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-cni-bin\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021954 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-ovnkube-config\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.023349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-run\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.022014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-system-cni-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.022288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.021548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-var-lib-kubelet\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 
19:57:48.022745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-host\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.022860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-run-netns\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.022927 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-os-release\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.020755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c893932-7c81-4353-821d-dd67be4edf70-etc-tuned\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023381 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-run\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-system-cni-dir\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-log-socket\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-host-cni-bin\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023701 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-host-slash\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-daemon-config\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023791 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-host-var-lib-cni-multus\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-systemd-units\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:57:48.024206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023902 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a27547a1-f78d-4d2d-8d4f-0816fae61920-device-dir\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-iptables-alerter-script\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-os-release\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-systemd\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.023959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-systemd\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-node-log\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c893932-7c81-4353-821d-dd67be4edf70-tmp\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-ovnkube-script-lib\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysctl-d\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-node-log\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024150 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtw2\" (UniqueName: \"kubernetes.io/projected/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-kube-api-access-tgtw2\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-env-overrides\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bebc2174-0145-4f91-b0a3-c497f508c693-ovn-node-metrics-cert\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8501acc2-dabe-4f52-9b02-ba92e386acb7-host\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-cni-binary-copy\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-var-lib-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024376 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-etc-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024449 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-iptables-alerter-script\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024524 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-etc-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-var-lib-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024579 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-ovnkube-config\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/16b4200b-7937-4b41-acdc-2d428d40a524-agent-certs\") pod \"konnectivity-agent-r64fz\" (UID: \"16b4200b-7937-4b41-acdc-2d428d40a524\") " pod="kube-system/konnectivity-agent-r64fz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-env-overrides\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c893932-7c81-4353-821d-dd67be4edf70-etc-sysctl-d\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.024957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bebc2174-0145-4f91-b0a3-c497f508c693-run-openvswitch\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.025234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8501acc2-dabe-4f52-9b02-ba92e386acb7-host\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.025260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-cni-binary-copy\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.022825 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-multus-socket-dir-parent\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm"
Apr 22 19:57:48.025930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.025380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bebc2174-0145-4f91-b0a3-c497f508c693-ovnkube-script-lib\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.027292 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.027271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bebc2174-0145-4f91-b0a3-c497f508c693-ovn-node-metrics-cert\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.030928 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.030902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb29t\" (UniqueName: \"kubernetes.io/projected/b7a6e97e-64c6-44de-9b0f-622a7b3a2316-kube-api-access-sb29t\") pod \"multus-additional-cni-plugins-zh622\" (UID: \"b7a6e97e-64c6-44de-9b0f-622a7b3a2316\") " pod="openshift-multus/multus-additional-cni-plugins-zh622"
Apr 22 19:57:48.031145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.031115 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx24c\" (UniqueName: \"kubernetes.io/projected/a27547a1-f78d-4d2d-8d4f-0816fae61920-kube-api-access-zx24c\") pod \"aws-ebs-csi-driver-node-mn8xg\" (UID: \"a27547a1-f78d-4d2d-8d4f-0816fae61920\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg"
Apr 22 19:57:48.033405 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.033382 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:57:48.033405 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.033405 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:57:48.033561 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.033418 2578 projected.go:194] Error preparing data for projected volume kube-api-access-wtnzd for pod openshift-network-diagnostics/network-check-target-rktp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:57:48.033561 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.033483 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd podName:0aaf6153-a940-4bb7-9f56-61f82d60b50d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:48.533466014 +0000 UTC m=+3.064188021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wtnzd" (UniqueName: "kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd") pod "network-check-target-rktp2" (UID: "0aaf6153-a940-4bb7-9f56-61f82d60b50d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:57:48.035341 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.035314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8snk\" (UniqueName: \"kubernetes.io/projected/bebc2174-0145-4f91-b0a3-c497f508c693-kube-api-access-n8snk\") pod \"ovnkube-node-8ncpz\" (UID: \"bebc2174-0145-4f91-b0a3-c497f508c693\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.035985 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.035964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjr6g\" (UniqueName: \"kubernetes.io/projected/64dcc192-4f40-4fa7-bb9c-1dacc5985c26-kube-api-access-gjr6g\") pod \"iptables-alerter-rd9fc\" (UID: \"64dcc192-4f40-4fa7-bb9c-1dacc5985c26\") " pod="openshift-network-operator/iptables-alerter-rd9fc"
Apr 22 19:57:48.036353 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.036333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtw2\" (UniqueName: \"kubernetes.io/projected/9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab-kube-api-access-tgtw2\") pod \"multus-m22tm\" (UID: \"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab\") " pod="openshift-multus/multus-m22tm"
Apr 22 19:57:48.036564 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.036417 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtqwl\" (UniqueName: \"kubernetes.io/projected/802bd93c-03cf-435c-a223-487ff037f6c7-kube-api-access-qtqwl\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx"
Apr 22 19:57:48.036834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.036794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbg2\" (UniqueName: \"kubernetes.io/projected/8501acc2-dabe-4f52-9b02-ba92e386acb7-kube-api-access-khbg2\") pod \"node-ca-pjssp\" (UID: \"8501acc2-dabe-4f52-9b02-ba92e386acb7\") " pod="openshift-image-registry/node-ca-pjssp"
Apr 22 19:57:48.038075 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.038056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dkfg\" (UniqueName: \"kubernetes.io/projected/4c893932-7c81-4353-821d-dd67be4edf70-kube-api-access-4dkfg\") pod \"tuned-5x5fq\" (UID: \"4c893932-7c81-4353-821d-dd67be4edf70\") " pod="openshift-cluster-node-tuning-operator/tuned-5x5fq"
Apr 22 19:57:48.139078 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.138980 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:48.202783 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.202749 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-m22tm"
Apr 22 19:57:48.208680 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.208659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5x5fq"
Apr 22 19:57:48.217153 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.217132 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rd9fc"
Apr 22 19:57:48.223776 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.223754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:57:48.229456 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.229438 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-r64fz"
Apr 22 19:57:48.235993 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.235975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg"
Apr 22 19:57:48.243502 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.243485 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pjssp"
Apr 22 19:57:48.249014 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.249000 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zh622"
Apr 22 19:57:48.526957 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.526864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx"
Apr 22 19:57:48.527137 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.527034 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:57:48.527137 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.527108 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs podName:802bd93c-03cf-435c-a223-487ff037f6c7 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:49.527084967 +0000 UTC m=+4.057806962 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs") pod "network-metrics-daemon-xbxhx" (UID: "802bd93c-03cf-435c-a223-487ff037f6c7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:57:48.579162 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.579034 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64dcc192_4f40_4fa7_bb9c_1dacc5985c26.slice/crio-634591f4a2510b6829cebfe662181f399b6dad317fad3d3b34a5bdc7b755c180 WatchSource:0}: Error finding container 634591f4a2510b6829cebfe662181f399b6dad317fad3d3b34a5bdc7b755c180: Status 404 returned error can't find the container with id 634591f4a2510b6829cebfe662181f399b6dad317fad3d3b34a5bdc7b755c180
Apr 22 19:57:48.580550 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.580514 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c8b3da8_9112_4c9d_abf4_97a17bb2a3ab.slice/crio-9d9803c3278afa9ced04735e6c434bc4a40993221a483a58106e0d8bd287e648 WatchSource:0}: Error finding container 9d9803c3278afa9ced04735e6c434bc4a40993221a483a58106e0d8bd287e648: Status 404 returned error can't find the container with id 9d9803c3278afa9ced04735e6c434bc4a40993221a483a58106e0d8bd287e648
Apr 22 19:57:48.581539 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.581516 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebc2174_0145_4f91_b0a3_c497f508c693.slice/crio-c9747690070c66213ba94c6a73804c731bcd66ece72283cb2574d70c26ec8bba WatchSource:0}: Error finding container c9747690070c66213ba94c6a73804c731bcd66ece72283cb2574d70c26ec8bba: Status 404 returned error can't find the container with id c9747690070c66213ba94c6a73804c731bcd66ece72283cb2574d70c26ec8bba
Apr 22 19:57:48.583077 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.582743 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a6e97e_64c6_44de_9b0f_622a7b3a2316.slice/crio-54c5ef46bbbbd59ef2a0bec070d86bb70220f15f03d3cba5b34e28f2f6c4d62c WatchSource:0}: Error finding container 54c5ef46bbbbd59ef2a0bec070d86bb70220f15f03d3cba5b34e28f2f6c4d62c: Status 404 returned error can't find the container with id 54c5ef46bbbbd59ef2a0bec070d86bb70220f15f03d3cba5b34e28f2f6c4d62c
Apr 22 19:57:48.584163 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.584123 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda27547a1_f78d_4d2d_8d4f_0816fae61920.slice/crio-656596ab7f6c60a22e412a489231c4b222581f175b3db2dee0fcb6978bc8a15e WatchSource:0}: Error finding container 656596ab7f6c60a22e412a489231c4b222581f175b3db2dee0fcb6978bc8a15e: Status 404 returned error can't find the container with id 656596ab7f6c60a22e412a489231c4b222581f175b3db2dee0fcb6978bc8a15e
Apr 22 19:57:48.588144 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.588119 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c893932_7c81_4353_821d_dd67be4edf70.slice/crio-0f582718ef7c8d81635582e4318e4b8dff4cdd9d6dfde8649e522b76d386e8ea WatchSource:0}: Error finding container 0f582718ef7c8d81635582e4318e4b8dff4cdd9d6dfde8649e522b76d386e8ea: Status 404 returned error can't find the container with id 0f582718ef7c8d81635582e4318e4b8dff4cdd9d6dfde8649e522b76d386e8ea
Apr 22 19:57:48.588731 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.588694 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8501acc2_dabe_4f52_9b02_ba92e386acb7.slice/crio-163cd750efaa42c354876ea4c2d994ba891c5a688a723c7a9bccc77faa7d77d3 WatchSource:0}: Error finding container 163cd750efaa42c354876ea4c2d994ba891c5a688a723c7a9bccc77faa7d77d3: Status 404 returned error can't find the container with id 163cd750efaa42c354876ea4c2d994ba891c5a688a723c7a9bccc77faa7d77d3
Apr 22 19:57:48.589927 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:57:48.589860 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b4200b_7937_4b41_acdc_2d428d40a524.slice/crio-467227ee03479905d5fa69bff6c2e602774ae5d9d5302af588ab75f6d26f3420 WatchSource:0}: Error finding container 467227ee03479905d5fa69bff6c2e602774ae5d9d5302af588ab75f6d26f3420: Status 404 returned error can't find the container with id 467227ee03479905d5fa69bff6c2e602774ae5d9d5302af588ab75f6d26f3420
Apr 22 19:57:48.627613 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.627589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2"
Apr 22 19:57:48.627755 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.627734 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:57:48.627755 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.627757 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:57:48.627897 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.627770 2578 projected.go:194] Error preparing data for projected volume kube-api-access-wtnzd for pod openshift-network-diagnostics/network-check-target-rktp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:57:48.627897 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:48.627837 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd podName:0aaf6153-a940-4bb7-9f56-61f82d60b50d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:49.627802894 +0000 UTC m=+4.158524881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtnzd" (UniqueName: "kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd") pod "network-check-target-rktp2" (UID: "0aaf6153-a940-4bb7-9f56-61f82d60b50d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:57:48.978438 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.978395 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:46 +0000 UTC" deadline="2027-12-20 03:39:31.628432676 +0000 UTC"
Apr 22 19:57:48.978438 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:48.978434 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14551h41m42.650002608s"
Apr 22 19:57:49.042224 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.042188 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx"
Apr 22 19:57:49.042391 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:49.042334 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7"
Apr 22 19:57:49.065772 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.065734 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" event={"ID":"761d5be20dc0ed38f4bc469fd088da76","Type":"ContainerStarted","Data":"d847f8f633049461b004538dd7414ad66c73e9366c5b23f9e5ee97638915d737"}
Apr 22 19:57:49.086729 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.086659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pjssp" event={"ID":"8501acc2-dabe-4f52-9b02-ba92e386acb7","Type":"ContainerStarted","Data":"163cd750efaa42c354876ea4c2d994ba891c5a688a723c7a9bccc77faa7d77d3"}
Apr 22 19:57:49.092638 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.092557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r64fz" event={"ID":"16b4200b-7937-4b41-acdc-2d428d40a524","Type":"ContainerStarted","Data":"467227ee03479905d5fa69bff6c2e602774ae5d9d5302af588ab75f6d26f3420"}
Apr 22 19:57:49.102546 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.102511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" event={"ID":"4c893932-7c81-4353-821d-dd67be4edf70","Type":"ContainerStarted","Data":"0f582718ef7c8d81635582e4318e4b8dff4cdd9d6dfde8649e522b76d386e8ea"}
Apr 22 19:57:49.105636 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.105605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerStarted","Data":"54c5ef46bbbbd59ef2a0bec070d86bb70220f15f03d3cba5b34e28f2f6c4d62c"}
Apr 22 19:57:49.110112 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.110082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rd9fc" event={"ID":"64dcc192-4f40-4fa7-bb9c-1dacc5985c26","Type":"ContainerStarted","Data":"634591f4a2510b6829cebfe662181f399b6dad317fad3d3b34a5bdc7b755c180"}
Apr 22 19:57:49.112110 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.112059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"c9747690070c66213ba94c6a73804c731bcd66ece72283cb2574d70c26ec8bba"}
Apr 22 19:57:49.113501 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.113461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" event={"ID":"a27547a1-f78d-4d2d-8d4f-0816fae61920","Type":"ContainerStarted","Data":"656596ab7f6c60a22e412a489231c4b222581f175b3db2dee0fcb6978bc8a15e"}
Apr 22 19:57:49.116564 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.116540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m22tm" event={"ID":"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab","Type":"ContainerStarted","Data":"9d9803c3278afa9ced04735e6c434bc4a40993221a483a58106e0d8bd287e648"}
Apr 22 19:57:49.535232 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.535194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx"
Apr 22 19:57:49.535401 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:49.535347 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:57:49.535488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:49.535408 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs podName:802bd93c-03cf-435c-a223-487ff037f6c7 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:51.535390586 +0000 UTC m=+6.066112574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs") pod "network-metrics-daemon-xbxhx" (UID: "802bd93c-03cf-435c-a223-487ff037f6c7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:57:49.636077 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.636034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2"
Apr 22 19:57:49.636352 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:49.636333 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:57:49.636438 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:49.636359 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:57:49.636438 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:49.636372 2578 projected.go:194] Error preparing data for projected volume kube-api-access-wtnzd for pod openshift-network-diagnostics/network-check-target-rktp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:57:49.636438 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:49.636429 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd podName:0aaf6153-a940-4bb7-9f56-61f82d60b50d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:51.636410664 +0000 UTC m=+6.167132664 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtnzd" (UniqueName: "kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd") pod "network-check-target-rktp2" (UID: "0aaf6153-a940-4bb7-9f56-61f82d60b50d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:57:49.699059 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:49.699012 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:50.048108 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:50.048069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2"
Apr 22 19:57:50.048554 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:50.048273 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d"
Apr 22 19:57:50.132317 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:50.132233 2578 generic.go:358] "Generic (PLEG): container finished" podID="bedb542f9001b846bfe76e857d486e03" containerID="11989ecbcba47f2410abcaa434232b441e0ca9a8ba9bec9c838db1f5e9e073be" exitCode=0
Apr 22 19:57:50.132898 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:50.132871 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" event={"ID":"bedb542f9001b846bfe76e857d486e03","Type":"ContainerDied","Data":"11989ecbcba47f2410abcaa434232b441e0ca9a8ba9bec9c838db1f5e9e073be"}
Apr 22 19:57:50.147300 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:50.147246 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-72.ec2.internal" podStartSLOduration=3.147228933 podStartE2EDuration="3.147228933s" podCreationTimestamp="2026-04-22 19:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:49.080963092 +0000 UTC m=+3.611685101" watchObservedRunningTime="2026-04-22 19:57:50.147228933 +0000 UTC m=+4.677950943"
Apr 22 19:57:51.042509 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:51.042474 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx"
Apr 22 19:57:51.042710 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:51.042631 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7"
Apr 22 19:57:51.138155 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:51.138113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" event={"ID":"bedb542f9001b846bfe76e857d486e03","Type":"ContainerStarted","Data":"96a05fdcf68579cbe8b7681c6e717d0ee4625b87539a228b5dc90fd9f83e31c5"}
Apr 22 19:57:51.150345 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:51.150291 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-72.ec2.internal" podStartSLOduration=4.150272469 podStartE2EDuration="4.150272469s" podCreationTimestamp="2026-04-22 19:57:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:51.149927098 +0000 UTC m=+5.680649109" watchObservedRunningTime="2026-04-22 19:57:51.150272469 +0000 UTC m=+5.680994481"
Apr 22 19:57:51.554349 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:51.553799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx"
Apr 22 19:57:51.554349 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:51.553926 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:57:51.554349 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:51.554008 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs podName:802bd93c-03cf-435c-a223-487ff037f6c7
nodeName:}" failed. No retries permitted until 2026-04-22 19:57:55.553982058 +0000 UTC m=+10.084704047 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs") pod "network-metrics-daemon-xbxhx" (UID: "802bd93c-03cf-435c-a223-487ff037f6c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:51.654435 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:51.654403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:51.654592 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:51.654563 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:51.654592 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:51.654590 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:51.654712 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:51.654603 2578 projected.go:194] Error preparing data for projected volume kube-api-access-wtnzd for pod openshift-network-diagnostics/network-check-target-rktp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:51.654712 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:51.654670 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd 
podName:0aaf6153-a940-4bb7-9f56-61f82d60b50d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:55.654653036 +0000 UTC m=+10.185375029 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtnzd" (UniqueName: "kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd") pod "network-check-target-rktp2" (UID: "0aaf6153-a940-4bb7-9f56-61f82d60b50d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:52.044958 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:52.044859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:52.045131 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:52.044987 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:57:53.042511 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:53.042476 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:53.043016 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:53.042627 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:57:54.042713 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:54.042679 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:54.043166 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:54.042832 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:57:55.041687 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:55.041655 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:55.041940 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:55.041799 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:57:55.583463 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:55.583421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:55.583926 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:55.583561 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:55.583926 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:55.583617 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs podName:802bd93c-03cf-435c-a223-487ff037f6c7 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:03.583602614 +0000 UTC m=+18.114324601 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs") pod "network-metrics-daemon-xbxhx" (UID: "802bd93c-03cf-435c-a223-487ff037f6c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:55.683917 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:55.683820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:55.684106 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:55.683982 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:55.684106 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:55.684009 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:55.684106 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:55.684022 2578 projected.go:194] Error preparing data for projected volume kube-api-access-wtnzd for pod openshift-network-diagnostics/network-check-target-rktp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:55.684106 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:55.684090 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd podName:0aaf6153-a940-4bb7-9f56-61f82d60b50d nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:03.684070631 +0000 UTC m=+18.214792633 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtnzd" (UniqueName: "kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd") pod "network-check-target-rktp2" (UID: "0aaf6153-a940-4bb7-9f56-61f82d60b50d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:56.043164 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.043131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:56.043355 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:56.043274 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:57:56.473897 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.473862 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xpbrx"] Apr 22 19:57:56.476743 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.476722 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.476895 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:56.476797 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:57:56.589793 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.589752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/88e758f9-14ca-4081-b67d-e9de91d6ddf6-kubelet-config\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.590230 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.589923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/88e758f9-14ca-4081-b67d-e9de91d6ddf6-dbus\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.590230 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.589968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.691288 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.691262 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/88e758f9-14ca-4081-b67d-e9de91d6ddf6-kubelet-config\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.691456 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.691339 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/88e758f9-14ca-4081-b67d-e9de91d6ddf6-dbus\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.691456 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.691368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.691572 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:56.691504 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:56.691572 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:56.691564 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret podName:88e758f9-14ca-4081-b67d-e9de91d6ddf6 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:57.191545771 +0000 UTC m=+11.722267765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret") pod "global-pull-secret-syncer-xpbrx" (UID: "88e758f9-14ca-4081-b67d-e9de91d6ddf6") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:56.691842 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.691821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/88e758f9-14ca-4081-b67d-e9de91d6ddf6-kubelet-config\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:56.691925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:56.691895 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/88e758f9-14ca-4081-b67d-e9de91d6ddf6-dbus\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:57.042725 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:57.042694 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:57.042932 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:57.042837 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:57:57.194445 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:57.194406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:57.194641 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:57.194614 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:57.194721 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:57.194684 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret podName:88e758f9-14ca-4081-b67d-e9de91d6ddf6 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:58.194664963 +0000 UTC m=+12.725386959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret") pod "global-pull-secret-syncer-xpbrx" (UID: "88e758f9-14ca-4081-b67d-e9de91d6ddf6") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:58.042356 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:58.042322 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:57:58.042869 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:58.042373 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:58.042869 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:58.042450 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:57:58.042869 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:58.042532 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:57:58.201375 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:58.201333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:57:58.201539 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:58.201509 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:58.201596 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:58.201570 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret podName:88e758f9-14ca-4081-b67d-e9de91d6ddf6 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:00.201553338 +0000 UTC m=+14.732275325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret") pod "global-pull-secret-syncer-xpbrx" (UID: "88e758f9-14ca-4081-b67d-e9de91d6ddf6") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:59.042406 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:57:59.042375 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:57:59.042829 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:57:59.042493 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:00.042048 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:00.042015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:00.042250 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:00.042063 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:00.042250 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:00.042139 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:00.042549 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:00.042526 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:00.215014 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:00.214977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:00.215191 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:00.215105 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:00.215191 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:00.215164 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret podName:88e758f9-14ca-4081-b67d-e9de91d6ddf6 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:04.215150136 +0000 UTC m=+18.745872123 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret") pod "global-pull-secret-syncer-xpbrx" (UID: "88e758f9-14ca-4081-b67d-e9de91d6ddf6") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:01.042681 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:01.042650 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:01.043043 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:01.042762 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:02.042192 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.042157 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:02.042354 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.042157 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:02.042354 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:02.042283 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:02.042354 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:02.042333 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:02.825326 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.825297 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qzz8z"] Apr 22 19:58:02.843618 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.843579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:02.850179 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.849964 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:58:02.850179 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.850079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gxmxw\"" Apr 22 19:58:02.850179 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.850089 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:58:02.933626 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.933593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz2vz\" (UniqueName: \"kubernetes.io/projected/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-kube-api-access-lz2vz\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " 
pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:02.933802 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.933639 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-tmp-dir\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:02.933802 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:02.933788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-hosts-file\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.034629 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.034589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-tmp-dir\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.034819 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.034681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-hosts-file\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.034819 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.034716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz2vz\" (UniqueName: \"kubernetes.io/projected/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-kube-api-access-lz2vz\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " 
pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.034895 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.034798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-hosts-file\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.042104 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.042084 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:03.042222 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:03.042180 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:03.046595 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.046573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-tmp-dir\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.050412 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.050387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz2vz\" (UniqueName: \"kubernetes.io/projected/f4ad43cf-a292-44ff-a1ae-9d139860c9cc-kube-api-access-lz2vz\") pod \"node-resolver-qzz8z\" (UID: \"f4ad43cf-a292-44ff-a1ae-9d139860c9cc\") " pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.153347 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.153314 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzz8z" Apr 22 19:58:03.639246 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.639214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:03.639397 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:03.639360 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:03.639433 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:03.639411 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs podName:802bd93c-03cf-435c-a223-487ff037f6c7 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:19.63939736 +0000 UTC m=+34.170119347 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs") pod "network-metrics-daemon-xbxhx" (UID: "802bd93c-03cf-435c-a223-487ff037f6c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:03.739850 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:03.739824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:03.740016 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:03.739980 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:03.740016 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:03.740000 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:03.740016 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:03.740009 2578 projected.go:194] Error preparing data for projected volume kube-api-access-wtnzd for pod openshift-network-diagnostics/network-check-target-rktp2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:03.740156 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:03.740062 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd 
podName:0aaf6153-a940-4bb7-9f56-61f82d60b50d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.74004445 +0000 UTC m=+34.270766440 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtnzd" (UniqueName: "kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd") pod "network-check-target-rktp2" (UID: "0aaf6153-a940-4bb7-9f56-61f82d60b50d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:04.043118 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:04.042549 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:04.043118 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:04.042704 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:04.043118 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:04.042774 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:04.043118 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:04.042900 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:04.243960 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:04.243919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:04.244129 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:04.244102 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:04.244201 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:04.244168 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret podName:88e758f9-14ca-4081-b67d-e9de91d6ddf6 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:12.244152726 +0000 UTC m=+26.774874729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret") pod "global-pull-secret-syncer-xpbrx" (UID: "88e758f9-14ca-4081-b67d-e9de91d6ddf6") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:05.042085 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:05.042048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:05.042384 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:05.042186 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:06.043487 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.043345 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:06.044291 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:06.043546 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:06.044291 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.043439 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:06.044291 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:06.043925 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:06.167225 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.167191 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"b2f587f2130ec60fc3b61e6008957da7231e2d6240c54c49829c789ba31f10b0"} Apr 22 19:58:06.167331 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.167238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"4c603971c80efc6e76ced92eeee8a8be882d548e94693815567274e046fcabbf"} Apr 22 19:58:06.168622 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.168570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" event={"ID":"a27547a1-f78d-4d2d-8d4f-0816fae61920","Type":"ContainerStarted","Data":"8c1232c61922e9fbca7b0d8ec7c0ce9584bbb0ce70df99b72a46bf23b4a74cd6"} Apr 22 19:58:06.169986 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.169949 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-m22tm" event={"ID":"9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab","Type":"ContainerStarted","Data":"e9abb30c1a6f4f36f2f82778ad56b31187d909840fe3169d7c5300118b63aee9"} Apr 22 19:58:06.172615 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.172588 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzz8z" event={"ID":"f4ad43cf-a292-44ff-a1ae-9d139860c9cc","Type":"ContainerStarted","Data":"b465d29ffe1cf99d81ba13fa0d2cac56739405bbb142476fd5a0bc22ae2d5acf"} Apr 22 19:58:06.172725 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.172639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzz8z" 
event={"ID":"f4ad43cf-a292-44ff-a1ae-9d139860c9cc","Type":"ContainerStarted","Data":"76eead85731f10518ed6eb0eb98976e433f114650fae207e0ec5fc2babda2ca9"} Apr 22 19:58:06.174550 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.174528 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pjssp" event={"ID":"8501acc2-dabe-4f52-9b02-ba92e386acb7","Type":"ContainerStarted","Data":"6907ca5ab3d23cf716ac94bf341ff1f1d516182f6b5a78ed2cf7e78a7bc1a6dd"} Apr 22 19:58:06.176104 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.176081 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-r64fz" event={"ID":"16b4200b-7937-4b41-acdc-2d428d40a524","Type":"ContainerStarted","Data":"6267cc1f5f7f0a6182d18007a8eea9d51f4609c099250f38caf5d188f67f2aa4"} Apr 22 19:58:06.178309 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.178276 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" event={"ID":"4c893932-7c81-4353-821d-dd67be4edf70","Type":"ContainerStarted","Data":"7c1f51ee544461f4306507026249d9e9b986461b66ebf758fad27c77bbea367d"} Apr 22 19:58:06.179734 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.179710 2578 generic.go:358] "Generic (PLEG): container finished" podID="b7a6e97e-64c6-44de-9b0f-622a7b3a2316" containerID="5ddc75ed6da2ef7be35ba7ffffce4f7e2154125f1cec05242cd8de817f8932e8" exitCode=0 Apr 22 19:58:06.179848 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.179747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerDied","Data":"5ddc75ed6da2ef7be35ba7ffffce4f7e2154125f1cec05242cd8de817f8932e8"} Apr 22 19:58:06.188251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.188207 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-m22tm" 
podStartSLOduration=3.291053371 podStartE2EDuration="20.188196526s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:57:48.582175872 +0000 UTC m=+3.112897858" lastFinishedPulling="2026-04-22 19:58:05.479319026 +0000 UTC m=+20.010041013" observedRunningTime="2026-04-22 19:58:06.18804934 +0000 UTC m=+20.718771360" watchObservedRunningTime="2026-04-22 19:58:06.188196526 +0000 UTC m=+20.718918534" Apr 22 19:58:06.214294 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.214249 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qzz8z" podStartSLOduration=4.214233772 podStartE2EDuration="4.214233772s" podCreationTimestamp="2026-04-22 19:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:06.213244827 +0000 UTC m=+20.743966837" watchObservedRunningTime="2026-04-22 19:58:06.214233772 +0000 UTC m=+20.744955780" Apr 22 19:58:06.226879 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.226835 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5x5fq" podStartSLOduration=3.371968156 podStartE2EDuration="20.226797617s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:57:48.5900911 +0000 UTC m=+3.120813088" lastFinishedPulling="2026-04-22 19:58:05.444920548 +0000 UTC m=+19.975642549" observedRunningTime="2026-04-22 19:58:06.226041703 +0000 UTC m=+20.756763712" watchObservedRunningTime="2026-04-22 19:58:06.226797617 +0000 UTC m=+20.757519626" Apr 22 19:58:06.239200 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.238768 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-r64fz" podStartSLOduration=3.385061746 podStartE2EDuration="20.238752007s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" 
firstStartedPulling="2026-04-22 19:57:48.591243226 +0000 UTC m=+3.121965213" lastFinishedPulling="2026-04-22 19:58:05.444933488 +0000 UTC m=+19.975655474" observedRunningTime="2026-04-22 19:58:06.238421225 +0000 UTC m=+20.769143237" watchObservedRunningTime="2026-04-22 19:58:06.238752007 +0000 UTC m=+20.769474017" Apr 22 19:58:06.250506 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.250425 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pjssp" podStartSLOduration=3.39641294 podStartE2EDuration="20.250412072s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:57:48.590943367 +0000 UTC m=+3.121665358" lastFinishedPulling="2026-04-22 19:58:05.444942502 +0000 UTC m=+19.975664490" observedRunningTime="2026-04-22 19:58:06.249785188 +0000 UTC m=+20.780507197" watchObservedRunningTime="2026-04-22 19:58:06.250412072 +0000 UTC m=+20.781134093" Apr 22 19:58:06.911336 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.911149 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:58:06.983332 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.983225 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:06.911326737Z","UUID":"1263b494-b895-4045-bf1b-d30195f8d629","Handler":null,"Name":"","Endpoint":""} Apr 22 19:58:06.984918 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.984896 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:58:06.984918 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:06.984924 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:58:07.041948 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.041869 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:07.042093 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:07.042006 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:07.183596 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.183559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rd9fc" event={"ID":"64dcc192-4f40-4fa7-bb9c-1dacc5985c26","Type":"ContainerStarted","Data":"991e7e65b5ff28b9aed7ce8764cfa61e727099dcb6104e54cf5633825dada038"} Apr 22 19:58:07.186337 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.186309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"d649ad38b949f63484c84711077e66e3e02499294db3e167dbe3671ed9c3ab28"} Apr 22 19:58:07.186470 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.186346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"78657e91cc05be76690fa7ad957ca76c094c97e29bb1d1285d6cf9d8125a8748"} Apr 22 19:58:07.186470 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.186359 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" 
event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"458ab467b7d5380a85b451badb59a5e45fb048577273cc5d48bbe12266759656"} Apr 22 19:58:07.186470 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.186371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"00dc0adc5d37dc258f1e82fb68860103e1f17d789c6f0627569b0004bc5daf74"} Apr 22 19:58:07.188296 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.188226 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" event={"ID":"a27547a1-f78d-4d2d-8d4f-0816fae61920","Type":"ContainerStarted","Data":"c59e0028725782db6d4302939c0e0ab9f843d300a6e165ee4fa5809a24781a3b"} Apr 22 19:58:07.195765 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:07.195721 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rd9fc" podStartSLOduration=4.332061326 podStartE2EDuration="21.195706487s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:57:48.581295876 +0000 UTC m=+3.112017863" lastFinishedPulling="2026-04-22 19:58:05.444941031 +0000 UTC m=+19.975663024" observedRunningTime="2026-04-22 19:58:07.19555314 +0000 UTC m=+21.726275152" watchObservedRunningTime="2026-04-22 19:58:07.195706487 +0000 UTC m=+21.726428481" Apr 22 19:58:08.042471 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:08.042434 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:08.042655 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:08.042574 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:08.042655 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:08.042643 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:08.042777 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:08.042738 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:08.192026 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:08.191984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" event={"ID":"a27547a1-f78d-4d2d-8d4f-0816fae61920","Type":"ContainerStarted","Data":"d844712649b49b37f3e876b75211ac52db4b85ebfc757d7ff03cc5ff76b2edda"} Apr 22 19:58:08.222468 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:08.222407 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mn8xg" podStartSLOduration=3.004381086 podStartE2EDuration="22.222389251s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:57:48.587950303 +0000 UTC m=+3.118672297" lastFinishedPulling="2026-04-22 19:58:07.805958472 +0000 UTC m=+22.336680462" observedRunningTime="2026-04-22 19:58:08.221997761 +0000 UTC m=+22.752719770" watchObservedRunningTime="2026-04-22 19:58:08.222389251 +0000 UTC m=+22.753111262" Apr 22 19:58:08.550659 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:08.550567 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:58:08.551282 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:08.551255 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:58:09.041939 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:09.041910 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:09.042098 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:09.042034 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:09.197341 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:09.197309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"f970db27b9896cc986c9f0750028691cb5366487a195a6fd77cc00335ddb7a51"} Apr 22 19:58:09.197923 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:09.197537 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:58:09.198238 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:09.198222 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-r64fz" Apr 22 19:58:10.042267 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:10.042238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:10.042424 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:10.042367 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:10.042488 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:10.042442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:10.042569 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:10.042546 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:11.041842 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.041653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:11.042362 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:11.041946 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:11.202614 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.202579 2578 generic.go:358] "Generic (PLEG): container finished" podID="b7a6e97e-64c6-44de-9b0f-622a7b3a2316" containerID="0b1a8eed7d894ea491a5d68ba9b340f304a04d903af5bd0f5fc875c41d90683f" exitCode=0 Apr 22 19:58:11.202789 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.202675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerDied","Data":"0b1a8eed7d894ea491a5d68ba9b340f304a04d903af5bd0f5fc875c41d90683f"} Apr 22 19:58:11.205962 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.205909 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" event={"ID":"bebc2174-0145-4f91-b0a3-c497f508c693","Type":"ContainerStarted","Data":"a8279726cea2c420bf2d6f9d2d6a90efb043d7013512a480cf7414445e53c267"} Apr 22 19:58:11.206471 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.206449 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:58:11.206471 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.206480 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:58:11.224392 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.222252 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:58:11.248563 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:11.248471 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" podStartSLOduration=8.014284562 podStartE2EDuration="25.248456659s" podCreationTimestamp="2026-04-22 19:57:46 +0000 
UTC" firstStartedPulling="2026-04-22 19:57:48.587623447 +0000 UTC m=+3.118345440" lastFinishedPulling="2026-04-22 19:58:05.821795549 +0000 UTC m=+20.352517537" observedRunningTime="2026-04-22 19:58:11.247940667 +0000 UTC m=+25.778662676" watchObservedRunningTime="2026-04-22 19:58:11.248456659 +0000 UTC m=+25.779178667" Apr 22 19:58:12.041972 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.041936 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:12.042411 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:12.042068 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:12.042480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.041938 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:12.042568 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:12.042537 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:12.209741 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.209706 2578 generic.go:358] "Generic (PLEG): container finished" podID="b7a6e97e-64c6-44de-9b0f-622a7b3a2316" containerID="21409b641556d45f7ed292eb64e52cd636f7d52aa319c0a749d645ca2d3ae1d6" exitCode=0 Apr 22 19:58:12.209916 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.209785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerDied","Data":"21409b641556d45f7ed292eb64e52cd636f7d52aa319c0a749d645ca2d3ae1d6"} Apr 22 19:58:12.210498 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.210402 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:58:12.226145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.226117 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz" Apr 22 19:58:12.305485 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.305244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:12.305640 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:12.305325 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:12.305640 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:12.305581 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret 
podName:88e758f9-14ca-4081-b67d-e9de91d6ddf6 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:28.305559543 +0000 UTC m=+42.836281545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret") pod "global-pull-secret-syncer-xpbrx" (UID: "88e758f9-14ca-4081-b67d-e9de91d6ddf6") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:12.812552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.811956 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xpbrx"] Apr 22 19:58:12.812552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.812104 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:12.812552 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:12.812209 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:12.814206 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.814175 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rktp2"] Apr 22 19:58:12.814339 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.814303 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:12.814415 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:12.814395 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:12.814980 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.814944 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xbxhx"] Apr 22 19:58:12.815091 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:12.815055 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:12.815193 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:12.815133 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:13.213636 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:13.213596 2578 generic.go:358] "Generic (PLEG): container finished" podID="b7a6e97e-64c6-44de-9b0f-622a7b3a2316" containerID="074834ffffc4399c85f35ceded3cfd33eb0cc3caf8acc4bbfcaf6dd9e8f2e11d" exitCode=0 Apr 22 19:58:13.214255 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:13.213692 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerDied","Data":"074834ffffc4399c85f35ceded3cfd33eb0cc3caf8acc4bbfcaf6dd9e8f2e11d"} Apr 22 19:58:14.042787 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:14.042753 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:14.042998 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:14.042913 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:15.041682 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:15.041655 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:15.042140 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:15.041652 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:15.042140 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:15.041825 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:15.042140 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:15.041854 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:16.043350 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:16.043311 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:16.044171 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:16.043421 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:17.042122 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:17.042078 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx" Apr 22 19:58:17.042297 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:17.042078 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:17.042379 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:17.042351 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbxhx" podUID="802bd93c-03cf-435c-a223-487ff037f6c7" Apr 22 19:58:17.042379 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:17.042210 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xpbrx" podUID="88e758f9-14ca-4081-b67d-e9de91d6ddf6" Apr 22 19:58:18.047001 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.046966 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:18.047509 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.047083 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rktp2" podUID="0aaf6153-a940-4bb7-9f56-61f82d60b50d" Apr 22 19:58:18.274119 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.273883 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-72.ec2.internal" event="NodeReady" Apr 22 19:58:18.274295 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.274227 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:58:18.312312 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.312245 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"] Apr 22 19:58:18.335215 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.335181 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"] Apr 22 19:58:18.335377 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.335327 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.338775 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.338738 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:58:18.339977 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.338988 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:58:18.340122 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.339514 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:58:18.340272 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.340253 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-md5kj\"" Apr 22 19:58:18.347071 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.347045 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:58:18.347632 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.347254 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lzbxz"] Apr 22 19:58:18.347632 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.347390 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:18.350114 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.350094 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bbzb4\"" Apr 22 19:58:18.350331 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.350315 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:58:18.350516 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.350502 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.350784 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.350749 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.359642 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.359620 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"] Apr 22 19:58:18.359642 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.359643 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lzbxz"] Apr 22 19:58:18.359835 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.359655 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rgmkt"] Apr 22 19:58:18.359938 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.359801 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.363709 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.363671 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 19:58:18.363917 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.363734 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.363917 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.363745 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 19:58:18.363917 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.363836 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.364166 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.363675 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-d6ns7\"" Apr 22 19:58:18.375890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.373995 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"] Apr 22 19:58:18.375890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.374133 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.375890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.375543 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 19:58:18.378834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.377365 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:58:18.380754 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.380736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9r99g\"" Apr 22 19:58:18.380979 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.380964 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:58:18.390935 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.390903 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"] Apr 22 19:58:18.391078 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.391042 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:18.393827 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.393792 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nfggw\"" Apr 22 19:58:18.393931 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.393795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 19:58:18.394989 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.394969 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.394989 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.394986 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 19:58:18.395183 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.395167 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.403483 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.403383 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"] Apr 22 19:58:18.403483 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.403412 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nbb5m"] Apr 22 19:58:18.403648 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.403540 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" Apr 22 19:58:18.405934 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.405908 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.406032 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.405966 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 19:58:18.406103 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.405915 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-q4px4\"" Apr 22 19:58:18.406202 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.405922 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 19:58:18.406316 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.406271 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.416259 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.416239 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"] Apr 22 19:58:18.416397 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.416379 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-nbb5m" Apr 22 19:58:18.419412 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.419261 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:58:18.419412 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.419347 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.419528 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.419428 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-mt4f2\"" Apr 22 19:58:18.419975 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.419958 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.420129 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.420108 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:58:18.425217 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.425197 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-748d997cd4-p2hd6"] Apr 22 19:58:18.425326 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.425313 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:58:18.425707 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.425689 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" Apr 22 19:58:18.428266 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.428239 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 19:58:18.428947 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.428927 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.429036 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.429014 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 19:58:18.429036 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.429028 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-csw46\"" Apr 22 19:58:18.430487 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.430470 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.440178 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.440160 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"] Apr 22 19:58:18.440300 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.440285 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:18.442969 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.442950 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.443255 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.443236 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 19:58:18.443465 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.443448 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 19:58:18.443762 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.443740 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 19:58:18.443883 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.443774 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.443883 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.443779 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 19:58:18.443993 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.443884 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-pwnl8\"" Apr 22 19:58:18.451688 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451666 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10902c6d-77cd-4e80-86e7-28633566a0ee-serving-cert\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 
19:58:18.451833 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10902c6d-77cd-4e80-86e7-28633566a0ee-trusted-ca\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.451833 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451782 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-certificates\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.451945 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451849 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-trusted-ca\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.451945 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b42faf8-dfc9-477e-a74b-abcef44beb8e-config-volume\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.451945 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nv4q\" (UniqueName: 
\"kubernetes.io/projected/10902c6d-77cd-4e80-86e7-28633566a0ee-kube-api-access-5nv4q\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.452134 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-installation-pull-secrets\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.452134 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.451993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10902c6d-77cd-4e80-86e7-28633566a0ee-config\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.452134 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:18.452134 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jljg\" (UniqueName: \"kubernetes.io/projected/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-kube-api-access-5jljg\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: 
\"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:18.452134 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-image-registry-private-configuration\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.452389 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-ca-trust-extracted\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.452927 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452674 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.452927 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-bound-sa-token\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.452927 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452740 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.452927 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3b42faf8-dfc9-477e-a74b-abcef44beb8e-tmp-dir\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.452927 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-476gp\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-kube-api-access-476gp\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.452927 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.452858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc49w\" (UniqueName: \"kubernetes.io/projected/3b42faf8-dfc9-477e-a74b-abcef44beb8e-kube-api-access-bc49w\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.454523 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.454504 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8zwhn"] Apr 22 19:58:18.454703 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.454687 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" Apr 22 19:58:18.456890 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.456868 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-t4wtg\"" Apr 22 19:58:18.457041 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.457029 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 19:58:18.457191 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.457177 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 19:58:18.466251 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.466227 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn"] Apr 22 19:58:18.466393 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.466369 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8zwhn" Apr 22 19:58:18.468583 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.468562 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.468682 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.468622 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xt4d8\"" Apr 22 19:58:18.468682 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.468572 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.468796 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.468783 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:58:18.481870 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.481840 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h"] Apr 22 19:58:18.481996 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.481980 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn" Apr 22 19:58:18.484118 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.484095 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.484244 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.484146 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-dxf4p\"" Apr 22 19:58:18.484244 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.484156 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.498527 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"] Apr 22 19:58:18.498527 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498533 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"] Apr 22 19:58:18.498713 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498563 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h" Apr 22 19:58:18.498713 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498595 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nbb5m"] Apr 22 19:58:18.498713 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498611 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rgmkt"] Apr 22 19:58:18.498713 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498678 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"] Apr 22 19:58:18.498713 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498696 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn"] Apr 22 19:58:18.498713 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498709 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8zwhn"] Apr 22 19:58:18.498961 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498723 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"] Apr 22 19:58:18.498961 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498734 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-748d997cd4-p2hd6"] Apr 22 19:58:18.498961 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.498744 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h"] Apr 22 19:58:18.500757 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.500736 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.500868 ip-10-0-135-72 
kubenswrapper[2578]: I0422 19:58:18.500843 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-25d4m\"" Apr 22 19:58:18.500868 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.500855 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.553172 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-bound-sa-token\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.553172 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-stats-auth\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3b42faf8-dfc9-477e-a74b-abcef44beb8e-tmp-dir\") pod 
\"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj2d\" (UniqueName: \"kubernetes.io/projected/99eb7f40-e81e-4454-b333-f70327da668c-kube-api-access-wtj2d\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-476gp\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-kube-api-access-476gp\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553314 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e05b401e-86d3-4f58-ba83-8727ba2b2682-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c79df87a-c734-4d84-b239-5cfbd4266788-tmp\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m" Apr 22 
19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553361 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c79df87a-c734-4d84-b239-5cfbd4266788-snapshots\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f0a675ce-7b40-4a85-8869-d492c9e0218d-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" Apr 22 19:58:18.553409 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c79df87a-c734-4d84-b239-5cfbd4266788-serving-cert\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b42faf8-dfc9-477e-a74b-abcef44beb8e-config-volume\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553449 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nv4q\" (UniqueName: 
\"kubernetes.io/projected/10902c6d-77cd-4e80-86e7-28633566a0ee-kube-api-access-5nv4q\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-installation-pull-secrets\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10902c6d-77cd-4e80-86e7-28633566a0ee-config\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 
19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553556 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8qk6\" (UniqueName: \"kubernetes.io/projected/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-kube-api-access-c8qk6\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-image-registry-private-configuration\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10902c6d-77cd-4e80-86e7-28633566a0ee-serving-cert\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4crxl\" (UniqueName: \"kubernetes.io/projected/e05b401e-86d3-4f58-ba83-8727ba2b2682-kube-api-access-4crxl\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-ca-trust-extracted\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.553834 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdh5l\" (UniqueName: \"kubernetes.io/projected/54257b88-b9cb-44e6-9885-78eb59be8c12-kube-api-access-zdh5l\") pod \"ingress-canary-8zwhn\" (UID: 
\"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-default-certificate\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc49w\" (UniqueName: \"kubernetes.io/projected/3b42faf8-dfc9-477e-a74b-abcef44beb8e-kube-api-access-bc49w\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fncd\" (UniqueName: \"kubernetes.io/projected/b589acde-4c45-4c44-be29-d3785bec1ccd-kube-api-access-8fncd\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 
19:58:18.553939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10902c6d-77cd-4e80-86e7-28633566a0ee-trusted-ca\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99eb7f40-e81e-4454-b333-f70327da668c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.553988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-certificates\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79df87a-c734-4d84-b239-5cfbd4266788-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b589acde-4c45-4c44-be29-d3785bec1ccd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554076 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b589acde-4c45-4c44-be29-d3785bec1ccd-config\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmc48\" (UniqueName: \"kubernetes.io/projected/b0d5f54b-81d2-4910-b1b2-87b6f74fa261-kube-api-access-mmc48\") pod \"network-check-source-8894fc9bd-f7ckn\" (UID: \"b0d5f54b-81d2-4910-b1b2-87b6f74fa261\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-trusted-ca\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99eb7f40-e81e-4454-b333-f70327da668c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jj6x\" (UniqueName: \"kubernetes.io/projected/c79df87a-c734-4d84-b239-5cfbd4266788-kube-api-access-4jj6x\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m" Apr 22 19:58:18.554480 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jljg\" (UniqueName: \"kubernetes.io/projected/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-kube-api-access-5jljg\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:18.555174 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.555174 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.554280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79df87a-c734-4d84-b239-5cfbd4266788-service-ca-bundle\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m" Apr 22 19:58:18.555174 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.554609 2578 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:18.555174 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.554622 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9: secret "image-registry-tls" not found Apr 22 19:58:18.555174 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.554710 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls podName:e9ed2f86-635a-4b6a-bb4c-a1b309537b91 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.054690809 +0000 UTC m=+33.585412799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls") pod "image-registry-6bdfcbd6fd-q8mx9" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91") : secret "image-registry-tls" not found Apr 22 19:58:18.555388 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.555197 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3b42faf8-dfc9-477e-a74b-abcef44beb8e-tmp-dir\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:18.555998 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.555969 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:18.556099 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.556042 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls podName:1ac0aac7-46c9-42f6-8aaa-e626360e1faa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.056024612 +0000 UTC m=+33.586746609 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6vcjz" (UID: "1ac0aac7-46c9-42f6-8aaa-e626360e1faa") : secret "samples-operator-tls" not found Apr 22 19:58:18.556173 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.556105 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:18.556173 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.556165 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls podName:3b42faf8-dfc9-477e-a74b-abcef44beb8e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.056151031 +0000 UTC m=+33.586873028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls") pod "dns-default-rgmkt" (UID: "3b42faf8-dfc9-477e-a74b-abcef44beb8e") : secret "dns-default-metrics-tls" not found Apr 22 19:58:18.556550 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.556491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-ca-trust-extracted\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:18.556774 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.556704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10902c6d-77cd-4e80-86e7-28633566a0ee-config\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 
19:58:18.556774 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.556745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-certificates\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:18.556925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.556829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10902c6d-77cd-4e80-86e7-28633566a0ee-trusted-ca\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:58:18.556925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.556903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-trusted-ca\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:18.557596 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.557577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b42faf8-dfc9-477e-a74b-abcef44beb8e-config-volume\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:58:18.561710 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.561667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-image-registry-private-configuration\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:18.561710 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.561685 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-installation-pull-secrets\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:18.564532 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.564464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-bound-sa-token\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:18.565420 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.565397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nv4q\" (UniqueName: \"kubernetes.io/projected/10902c6d-77cd-4e80-86e7-28633566a0ee-kube-api-access-5nv4q\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:58:18.565690 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.565668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc49w\" (UniqueName: \"kubernetes.io/projected/3b42faf8-dfc9-477e-a74b-abcef44beb8e-kube-api-access-bc49w\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:58:18.566318 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.566296 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-476gp\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-kube-api-access-476gp\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:18.571454 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.571433 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jljg\" (UniqueName: \"kubernetes.io/projected/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-kube-api-access-5jljg\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"
Apr 22 19:58:18.576120 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.576080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10902c6d-77cd-4e80-86e7-28633566a0ee-serving-cert\") pod \"console-operator-9d4b6777b-lzbxz\" (UID: \"10902c6d-77cd-4e80-86e7-28633566a0ee\") " pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:58:18.655499 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79df87a-c734-4d84-b239-5cfbd4266788-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.655499 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b589acde-4c45-4c44-be29-d3785bec1ccd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"
Apr 22 19:58:18.655710 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b589acde-4c45-4c44-be29-d3785bec1ccd-config\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"
Apr 22 19:58:18.655710 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmc48\" (UniqueName: \"kubernetes.io/projected/b0d5f54b-81d2-4910-b1b2-87b6f74fa261-kube-api-access-mmc48\") pod \"network-check-source-8894fc9bd-f7ckn\" (UID: \"b0d5f54b-81d2-4910-b1b2-87b6f74fa261\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn"
Apr 22 19:58:18.655710 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99eb7f40-e81e-4454-b333-f70327da668c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"
Apr 22 19:58:18.655710 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jj6x\" (UniqueName: \"kubernetes.io/projected/c79df87a-c734-4d84-b239-5cfbd4266788-kube-api-access-4jj6x\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.655885 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79df87a-c734-4d84-b239-5cfbd4266788-service-ca-bundle\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.655885 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655739 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbq7v\" (UniqueName: \"kubernetes.io/projected/93e25588-349f-4ceb-bdeb-d27b0e71e171-kube-api-access-bbq7v\") pod \"volume-data-source-validator-7c6cbb6c87-zmn8h\" (UID: \"93e25588-349f-4ceb-bdeb-d27b0e71e171\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h"
Apr 22 19:58:18.655885 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-stats-auth\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.655885 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj2d\" (UniqueName: \"kubernetes.io/projected/99eb7f40-e81e-4454-b333-f70327da668c-kube-api-access-wtj2d\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"
Apr 22 19:58:18.655885 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e05b401e-86d3-4f58-ba83-8727ba2b2682-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:18.656069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c79df87a-c734-4d84-b239-5cfbd4266788-tmp\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.656069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c79df87a-c734-4d84-b239-5cfbd4266788-snapshots\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.656069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f0a675ce-7b40-4a85-8869-d492c9e0218d-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"
Apr 22 19:58:18.656069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.655974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c79df87a-c734-4d84-b239-5cfbd4266788-serving-cert\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.656069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.656069 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656071 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8qk6\" (UniqueName: \"kubernetes.io/projected/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-kube-api-access-c8qk6\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656191 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4crxl\" (UniqueName: \"kubernetes.io/projected/e05b401e-86d3-4f58-ba83-8727ba2b2682-kube-api-access-4crxl\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656222 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdh5l\" (UniqueName: \"kubernetes.io/projected/54257b88-b9cb-44e6-9885-78eb59be8c12-kube-api-access-zdh5l\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-default-certificate\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b589acde-4c45-4c44-be29-d3785bec1ccd-config\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"
Apr 22 19:58:18.656338 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fncd\" (UniqueName: \"kubernetes.io/projected/b589acde-4c45-4c44-be29-d3785bec1ccd-kube-api-access-8fncd\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"
Apr 22 19:58:18.656696 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79df87a-c734-4d84-b239-5cfbd4266788-service-ca-bundle\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.656696 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.656506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79df87a-c734-4d84-b239-5cfbd4266788-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.656696 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.656600 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.156582193 +0000 UTC m=+33.687304184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : configmap references non-existent config key: service-ca.crt
Apr 22 19:58:18.657005 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.656982 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:58:18.657115 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.657043 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.157026134 +0000 UTC m=+33.687748125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : secret "router-metrics-certs-default" not found
Apr 22 19:58:18.657115 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.657102 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:58:18.657227 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.657126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c79df87a-c734-4d84-b239-5cfbd4266788-snapshots\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.657227 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.657139 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert podName:f0a675ce-7b40-4a85-8869-d492c9e0218d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.157128307 +0000 UTC m=+33.687850295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5lsr2" (UID: "f0a675ce-7b40-4a85-8869-d492c9e0218d") : secret "networking-console-plugin-cert" not found
Apr 22 19:58:18.657227 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.657204 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:58:18.657429 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.657236 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert podName:54257b88-b9cb-44e6-9885-78eb59be8c12 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.157225433 +0000 UTC m=+33.687947443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert") pod "ingress-canary-8zwhn" (UID: "54257b88-b9cb-44e6-9885-78eb59be8c12") : secret "canary-serving-cert" not found
Apr 22 19:58:18.657429 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.657376 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f0a675ce-7b40-4a85-8869-d492c9e0218d-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"
Apr 22 19:58:18.657429 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.657439 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99eb7f40-e81e-4454-b333-f70327da668c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"
Apr 22 19:58:18.657836 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.657793 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:58:18.657926 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:18.657889 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls podName:e05b401e-86d3-4f58-ba83-8727ba2b2682 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.063619385 +0000 UTC m=+33.688592974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fvkt4" (UID: "e05b401e-86d3-4f58-ba83-8727ba2b2682") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:58:18.658024 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.658002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99eb7f40-e81e-4454-b333-f70327da668c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"
Apr 22 19:58:18.658185 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.658138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e05b401e-86d3-4f58-ba83-8727ba2b2682-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:18.658308 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.658252 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c79df87a-c734-4d84-b239-5cfbd4266788-tmp\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.658632 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.658607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b589acde-4c45-4c44-be29-d3785bec1ccd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"
Apr 22 19:58:18.659421 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.659399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99eb7f40-e81e-4454-b333-f70327da668c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"
Apr 22 19:58:18.659666 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.659651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c79df87a-c734-4d84-b239-5cfbd4266788-serving-cert\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.659756 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.659735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-default-certificate\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.660323 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.660287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-stats-auth\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.664147 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.664111 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmc48\" (UniqueName: \"kubernetes.io/projected/b0d5f54b-81d2-4910-b1b2-87b6f74fa261-kube-api-access-mmc48\") pod \"network-check-source-8894fc9bd-f7ckn\" (UID: \"b0d5f54b-81d2-4910-b1b2-87b6f74fa261\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn"
Apr 22 19:58:18.668981 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.665305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jj6x\" (UniqueName: \"kubernetes.io/projected/c79df87a-c734-4d84-b239-5cfbd4266788-kube-api-access-4jj6x\") pod \"insights-operator-585dfdc468-nbb5m\" (UID: \"c79df87a-c734-4d84-b239-5cfbd4266788\") " pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.668981 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.665440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8qk6\" (UniqueName: \"kubernetes.io/projected/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-kube-api-access-c8qk6\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:18.668981 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.667148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdh5l\" (UniqueName: \"kubernetes.io/projected/54257b88-b9cb-44e6-9885-78eb59be8c12-kube-api-access-zdh5l\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn"
Apr 22 19:58:18.668981 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.667220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj2d\" (UniqueName: \"kubernetes.io/projected/99eb7f40-e81e-4454-b333-f70327da668c-kube-api-access-wtj2d\") pod \"kube-storage-version-migrator-operator-6769c5d45-hdqk8\" (UID: \"99eb7f40-e81e-4454-b333-f70327da668c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"
Apr 22 19:58:18.670482 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.670460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4crxl\" (UniqueName: \"kubernetes.io/projected/e05b401e-86d3-4f58-ba83-8727ba2b2682-kube-api-access-4crxl\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:18.671930 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.671906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fncd\" (UniqueName: \"kubernetes.io/projected/b589acde-4c45-4c44-be29-d3785bec1ccd-kube-api-access-8fncd\") pod \"service-ca-operator-d6fc45fc5-lgx98\" (UID: \"b589acde-4c45-4c44-be29-d3785bec1ccd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"
Apr 22 19:58:18.677681 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.677662 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:58:18.712573 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.712540 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"
Apr 22 19:58:18.727376 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.727343 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-nbb5m"
Apr 22 19:58:18.735448 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.735406 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"
Apr 22 19:58:18.758050 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.758026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbq7v\" (UniqueName: \"kubernetes.io/projected/93e25588-349f-4ceb-bdeb-d27b0e71e171-kube-api-access-bbq7v\") pod \"volume-data-source-validator-7c6cbb6c87-zmn8h\" (UID: \"93e25588-349f-4ceb-bdeb-d27b0e71e171\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h"
Apr 22 19:58:18.766477 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.766452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbq7v\" (UniqueName: \"kubernetes.io/projected/93e25588-349f-4ceb-bdeb-d27b0e71e171-kube-api-access-bbq7v\") pod \"volume-data-source-validator-7c6cbb6c87-zmn8h\" (UID: \"93e25588-349f-4ceb-bdeb-d27b0e71e171\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h"
Apr 22 19:58:18.792304 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.792276 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn"
Apr 22 19:58:18.808085 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:18.808052 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h"
Apr 22 19:58:19.047670 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.044023 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx"
Apr 22 19:58:19.047670 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.044470 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx"
Apr 22 19:58:19.047670 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.047116 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxcj4\""
Apr 22 19:58:19.047670 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.047377 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 19:58:19.047670 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.047564 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.062029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.063140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.063183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.063327 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.063343 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9: secret "image-registry-tls" not found
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.063427 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls podName:e9ed2f86-635a-4b6a-bb4c-a1b309537b91 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.063408782 +0000 UTC m=+34.594130770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls") pod "image-registry-6bdfcbd6fd-q8mx9" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91") : secret "image-registry-tls" not found
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.063578 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.063635 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls podName:3b42faf8-dfc9-477e-a74b-abcef44beb8e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.063619385 +0000 UTC m=+34.594341373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls") pod "dns-default-rgmkt" (UID: "3b42faf8-dfc9-477e-a74b-abcef44beb8e") : secret "dns-default-metrics-tls" not found
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.063689 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:58:19.064110 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.063720 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls podName:1ac0aac7-46c9-42f6-8aaa-e626360e1faa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.063710689 +0000 UTC m=+34.594432677 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6vcjz" (UID: "1ac0aac7-46c9-42f6-8aaa-e626360e1faa") : secret "samples-operator-tls" not found
Apr 22 19:58:19.121794 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.121752 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn"]
Apr 22 19:58:19.131689 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.131660 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h"]
Apr 22 19:58:19.137990 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.137921 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-lzbxz"]
Apr 22 19:58:19.138892 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.138874 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nbb5m"]
Apr 22 19:58:19.144550 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:19.144518 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d5f54b_81d2_4910_b1b2_87b6f74fa261.slice/crio-b2aff71047ea46c6303042765025315095cc74ce3b4e2e0a2df3db4b8eca6c18 WatchSource:0}: Error finding container b2aff71047ea46c6303042765025315095cc74ce3b4e2e0a2df3db4b8eca6c18: Status 404 returned error can't find the container with id b2aff71047ea46c6303042765025315095cc74ce3b4e2e0a2df3db4b8eca6c18
Apr 22 19:58:19.145211 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:19.145148 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e25588_349f_4ceb_bdeb_d27b0e71e171.slice/crio-44226324c485bf4b22639e6935a115a9f7e773490983f3a1e947d95f0593c67b WatchSource:0}: Error finding container 44226324c485bf4b22639e6935a115a9f7e773490983f3a1e947d95f0593c67b: Status 404 returned error can't find the container with id 44226324c485bf4b22639e6935a115a9f7e773490983f3a1e947d95f0593c67b
Apr 22 19:58:19.146133 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:19.146117 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79df87a_c734_4d84_b239_5cfbd4266788.slice/crio-3ca0b7d0f45182367be35635019321bf84a567cd81bead4ba5c2dc49660269e7 WatchSource:0}: Error finding container 3ca0b7d0f45182367be35635019321bf84a567cd81bead4ba5c2dc49660269e7: Status 404 returned error can't find the container with id 3ca0b7d0f45182367be35635019321bf84a567cd81bead4ba5c2dc49660269e7
Apr 22 19:58:19.146745 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:19.146715 2578 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10902c6d_77cd_4e80_86e7_28633566a0ee.slice/crio-8238f0de61b318261b5ef5111b856530a8ae4de8fcf67d5aa8b356475d859128 WatchSource:0}: Error finding container 8238f0de61b318261b5ef5111b856530a8ae4de8fcf67d5aa8b356475d859128: Status 404 returned error can't find the container with id 8238f0de61b318261b5ef5111b856530a8ae4de8fcf67d5aa8b356475d859128 Apr 22 19:58:19.149296 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.148026 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98"] Apr 22 19:58:19.150580 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.150545 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8"] Apr 22 19:58:19.154238 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:19.154216 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99eb7f40_e81e_4454_b333_f70327da668c.slice/crio-ade4e923119d62bdfedd5e8f75c4658c8cd9a986dc0a067c64b5c3675c87e274 WatchSource:0}: Error finding container ade4e923119d62bdfedd5e8f75c4658c8cd9a986dc0a067c64b5c3675c87e274: Status 404 returned error can't find the container with id ade4e923119d62bdfedd5e8f75c4658c8cd9a986dc0a067c64b5c3675c87e274 Apr 22 19:58:19.156574 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:19.156557 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb589acde_4c45_4c44_be29_d3785bec1ccd.slice/crio-e00174021c7c47116ae3793192543cbd273a6ad48d1c6cd4cb103f7ca7a752aa WatchSource:0}: Error finding container e00174021c7c47116ae3793192543cbd273a6ad48d1c6cd4cb103f7ca7a752aa: Status 404 returned error can't find the container with id e00174021c7c47116ae3793192543cbd273a6ad48d1c6cd4cb103f7ca7a752aa Apr 22 
19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.164979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.165010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.165033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.165103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn" Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165126 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165131 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.165107202 +0000 UTC m=+34.695829190 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165169 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165175 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.165159311 +0000 UTC m=+34.695881315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : secret "router-metrics-certs-default" not found Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165190 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165202 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert podName:54257b88-b9cb-44e6-9885-78eb59be8c12 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:20.165191725 +0000 UTC m=+34.695913714 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert") pod "ingress-canary-8zwhn" (UID: "54257b88-b9cb-44e6-9885-78eb59be8c12") : secret "canary-serving-cert" not found Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165225 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert podName:f0a675ce-7b40-4a85-8869-d492c9e0218d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.165214373 +0000 UTC m=+34.695936360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5lsr2" (UID: "f0a675ce-7b40-4a85-8869-d492c9e0218d") : secret "networking-console-plugin-cert" not found Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.165263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165412 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:19.165580 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.165451 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls podName:e05b401e-86d3-4f58-ba83-8727ba2b2682 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.165438604 +0000 UTC m=+34.696160594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fvkt4" (UID: "e05b401e-86d3-4f58-ba83-8727ba2b2682") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:19.227127 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.227095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" event={"ID":"10902c6d-77cd-4e80-86e7-28633566a0ee","Type":"ContainerStarted","Data":"8238f0de61b318261b5ef5111b856530a8ae4de8fcf67d5aa8b356475d859128"} Apr 22 19:58:19.228199 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.228164 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" event={"ID":"99eb7f40-e81e-4454-b333-f70327da668c","Type":"ContainerStarted","Data":"ade4e923119d62bdfedd5e8f75c4658c8cd9a986dc0a067c64b5c3675c87e274"} Apr 22 19:58:19.229287 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.229250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h" event={"ID":"93e25588-349f-4ceb-bdeb-d27b0e71e171","Type":"ContainerStarted","Data":"44226324c485bf4b22639e6935a115a9f7e773490983f3a1e947d95f0593c67b"} Apr 22 19:58:19.230339 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.230308 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nbb5m" 
event={"ID":"c79df87a-c734-4d84-b239-5cfbd4266788","Type":"ContainerStarted","Data":"3ca0b7d0f45182367be35635019321bf84a567cd81bead4ba5c2dc49660269e7"} Apr 22 19:58:19.231369 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.231348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" event={"ID":"b589acde-4c45-4c44-be29-d3785bec1ccd","Type":"ContainerStarted","Data":"e00174021c7c47116ae3793192543cbd273a6ad48d1c6cd4cb103f7ca7a752aa"} Apr 22 19:58:19.232460 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.232432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn" event={"ID":"b0d5f54b-81d2-4910-b1b2-87b6f74fa261","Type":"ContainerStarted","Data":"b2aff71047ea46c6303042765025315095cc74ce3b4e2e0a2df3db4b8eca6c18"} Apr 22 19:58:19.669858 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.669558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:19.670058 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.669725 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:58:19.670058 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:19.669995 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs podName:802bd93c-03cf-435c-a223-487ff037f6c7 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:51.669974864 +0000 UTC m=+66.200696856 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs") pod "network-metrics-daemon-xbxhx" (UID: "802bd93c-03cf-435c-a223-487ff037f6c7") : secret "metrics-daemon-secret" not found Apr 22 19:58:19.771004 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.770915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:19.776672 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:19.776607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnzd\" (UniqueName: \"kubernetes.io/projected/0aaf6153-a940-4bb7-9f56-61f82d60b50d-kube-api-access-wtnzd\") pod \"network-check-target-rktp2\" (UID: \"0aaf6153-a940-4bb7-9f56-61f82d60b50d\") " pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:20.050526 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.050455 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:20.056886 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.054939 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjpls\"" Apr 22 19:58:20.066344 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.066320 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.074148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.074202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.074306 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.074505 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.074569 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls podName:1ac0aac7-46c9-42f6-8aaa-e626360e1faa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:22.074550444 +0000 UTC m=+36.605272456 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6vcjz" (UID: "1ac0aac7-46c9-42f6-8aaa-e626360e1faa") : secret "samples-operator-tls" not found Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.075008 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.075058 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls podName:3b42faf8-dfc9-477e-a74b-abcef44beb8e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:22.075041931 +0000 UTC m=+36.605763921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls") pod "dns-default-rgmkt" (UID: "3b42faf8-dfc9-477e-a74b-abcef44beb8e") : secret "dns-default-metrics-tls" not found Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.075128 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.075139 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9: secret "image-registry-tls" not found Apr 22 19:58:20.075207 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.075171 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls podName:e9ed2f86-635a-4b6a-bb4c-a1b309537b91 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:22.075160891 +0000 UTC m=+36.605882883 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls") pod "image-registry-6bdfcbd6fd-q8mx9" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91") : secret "image-registry-tls" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.175376 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.175428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.175456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.175485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: 
\"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn" Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.175562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.175792 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.175880 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls podName:e05b401e-86d3-4f58-ba83-8727ba2b2682 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:22.175860041 +0000 UTC m=+36.706582037 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fvkt4" (UID: "e05b401e-86d3-4f58-ba83-8727ba2b2682") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.175957 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:22.175945391 +0000 UTC m=+36.706667381 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.176017 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.176048 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:22.176038147 +0000 UTC m=+36.706760140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : secret "router-metrics-certs-default" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.176097 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.176128 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert podName:f0a675ce-7b40-4a85-8869-d492c9e0218d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:22.176115788 +0000 UTC m=+36.706837787 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5lsr2" (UID: "f0a675ce-7b40-4a85-8869-d492c9e0218d") : secret "networking-console-plugin-cert" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.176184 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:20.176971 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:20.176212 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert podName:54257b88-b9cb-44e6-9885-78eb59be8c12 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:22.17620165 +0000 UTC m=+36.706923641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert") pod "ingress-canary-8zwhn" (UID: "54257b88-b9cb-44e6-9885-78eb59be8c12") : secret "canary-serving-cert" not found Apr 22 19:58:20.243936 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.243361 2578 generic.go:358] "Generic (PLEG): container finished" podID="b7a6e97e-64c6-44de-9b0f-622a7b3a2316" containerID="8d6eacf6352adaf6cd40c9a93925ca97b241a2c3d373817f4377f9d4b0fec543" exitCode=0 Apr 22 19:58:20.243936 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.243442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerDied","Data":"8d6eacf6352adaf6cd40c9a93925ca97b241a2c3d373817f4377f9d4b0fec543"} Apr 22 19:58:20.243936 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:20.243476 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rktp2"] Apr 22 19:58:21.248952 
ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:21.248882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rktp2" event={"ID":"0aaf6153-a940-4bb7-9f56-61f82d60b50d","Type":"ContainerStarted","Data":"7218a312a9641f547a7d40367b1b618a8885449db41dcbfd5c7f863a99e16e08"} Apr 22 19:58:21.257186 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:21.257152 2578 generic.go:358] "Generic (PLEG): container finished" podID="b7a6e97e-64c6-44de-9b0f-622a7b3a2316" containerID="593df75214eee7b197161b440b0be8cf7645f763df8a5cd7a4a6ed2330de6874" exitCode=0 Apr 22 19:58:21.257324 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:21.257211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerDied","Data":"593df75214eee7b197161b440b0be8cf7645f763df8a5cd7a4a6ed2330de6874"} Apr 22 19:58:22.095184 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.095153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:22.095377 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.095195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:22.095377 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.095273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:22.095377 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.095309 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:22.095377 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.095374 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:22.095621 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.095382 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls podName:3b42faf8-dfc9-477e-a74b-abcef44beb8e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.095365482 +0000 UTC m=+40.626087487 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls") pod "dns-default-rgmkt" (UID: "3b42faf8-dfc9-477e-a74b-abcef44beb8e") : secret "dns-default-metrics-tls" not found Apr 22 19:58:22.095621 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.095391 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:22.095621 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.095409 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9: secret "image-registry-tls" not found Apr 22 19:58:22.095621 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.095457 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls podName:e9ed2f86-635a-4b6a-bb4c-a1b309537b91 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.095438562 +0000 UTC m=+40.626160565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls") pod "image-registry-6bdfcbd6fd-q8mx9" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91") : secret "image-registry-tls" not found Apr 22 19:58:22.095621 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.095615 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls podName:1ac0aac7-46c9-42f6-8aaa-e626360e1faa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.095464769 +0000 UTC m=+40.626186777 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6vcjz" (UID: "1ac0aac7-46c9-42f6-8aaa-e626360e1faa") : secret "samples-operator-tls" not found Apr 22 19:58:22.195904 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.195870 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:22.196074 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.195922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:22.196074 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.195945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" Apr 22 19:58:22.196074 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.195970 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " 
pod="openshift-ingress-canary/ingress-canary-8zwhn" Apr 22 19:58:22.196074 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:22.196025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:22.196279 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196098 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.196079286 +0000 UTC m=+40.726801274 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:22.196279 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196163 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:22.196279 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196218 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls podName:e05b401e-86d3-4f58-ba83-8727ba2b2682 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.196202433 +0000 UTC m=+40.726924432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fvkt4" (UID: "e05b401e-86d3-4f58-ba83-8727ba2b2682") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:22.196279 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196275 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:22.196505 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196307 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.196296758 +0000 UTC m=+40.727018752 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : secret "router-metrics-certs-default" not found Apr 22 19:58:22.196505 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196359 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:22.196505 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196396 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert podName:f0a675ce-7b40-4a85-8869-d492c9e0218d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.196383666 +0000 UTC m=+40.727105660 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5lsr2" (UID: "f0a675ce-7b40-4a85-8869-d492c9e0218d") : secret "networking-console-plugin-cert" not found Apr 22 19:58:22.196505 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196445 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:22.196505 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:22.196484 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert podName:54257b88-b9cb-44e6-9885-78eb59be8c12 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:26.196473871 +0000 UTC m=+40.727195862 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert") pod "ingress-canary-8zwhn" (UID: "54257b88-b9cb-44e6-9885-78eb59be8c12") : secret "canary-serving-cert" not found Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.140338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.140457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " 
pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.140494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.140610 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.140609 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.140623 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9: secret "image-registry-tls" not found Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.140674 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls podName:1ac0aac7-46c9-42f6-8aaa-e626360e1faa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.140655104 +0000 UTC m=+48.671377094 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6vcjz" (UID: "1ac0aac7-46c9-42f6-8aaa-e626360e1faa") : secret "samples-operator-tls" not found Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.140692 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls podName:e9ed2f86-635a-4b6a-bb4c-a1b309537b91 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.140682616 +0000 UTC m=+48.671404609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls") pod "image-registry-6bdfcbd6fd-q8mx9" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91") : secret "image-registry-tls" not found Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.140703 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:26.141181 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.140729 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls podName:3b42faf8-dfc9-477e-a74b-abcef44beb8e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.140719547 +0000 UTC m=+48.671441534 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls") pod "dns-default-rgmkt" (UID: "3b42faf8-dfc9-477e-a74b-abcef44beb8e") : secret "dns-default-metrics-tls" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.241789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.241852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.241881 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.241908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn" Apr 22 19:58:26.242488 ip-10-0-135-72 
kubenswrapper[2578]: I0422 19:58:26.241965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242119 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242166 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls podName:e05b401e-86d3-4f58-ba83-8727ba2b2682 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.242149985 +0000 UTC m=+48.772871972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fvkt4" (UID: "e05b401e-86d3-4f58-ba83-8727ba2b2682") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242222 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.242215154 +0000 UTC m=+48.772937141 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242269 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242290 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.242283896 +0000 UTC m=+48.773005883 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : secret "router-metrics-certs-default" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242330 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242351 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert podName:f0a675ce-7b40-4a85-8869-d492c9e0218d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.242344886 +0000 UTC m=+48.773066872 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5lsr2" (UID: "f0a675ce-7b40-4a85-8869-d492c9e0218d") : secret "networking-console-plugin-cert" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242383 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:26.242488 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:26.242407 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert podName:54257b88-b9cb-44e6-9885-78eb59be8c12 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.242398666 +0000 UTC m=+48.773120655 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert") pod "ingress-canary-8zwhn" (UID: "54257b88-b9cb-44e6-9885-78eb59be8c12") : secret "canary-serving-cert" not found Apr 22 19:58:26.272909 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.272346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h" event={"ID":"93e25588-349f-4ceb-bdeb-d27b0e71e171","Type":"ContainerStarted","Data":"5aeb69529f76ebe169aee6667d931bd950e3af7a1f2a1bd65214577a8eeba52d"} Apr 22 19:58:26.274547 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.274477 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nbb5m" event={"ID":"c79df87a-c734-4d84-b239-5cfbd4266788","Type":"ContainerStarted","Data":"c60e9425acd680c3e1a28d4bfb8fa68d19c071fc1938d13fb9ba03f129c6f0ab"} Apr 22 19:58:26.276535 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.276507 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" event={"ID":"b589acde-4c45-4c44-be29-d3785bec1ccd","Type":"ContainerStarted","Data":"56582f7b1171be02c38bdb2f860fbe56279dcf33737b2768653184d88212df94"} Apr 22 19:58:26.278165 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.278140 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn" event={"ID":"b0d5f54b-81d2-4910-b1b2-87b6f74fa261","Type":"ContainerStarted","Data":"b162561f780fc3aacf191f4fa75c2cb3042d33206752971e1bb96c1829cd3541"} Apr 22 19:58:26.280024 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.280001 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rktp2" event={"ID":"0aaf6153-a940-4bb7-9f56-61f82d60b50d","Type":"ContainerStarted","Data":"4e224dc1148db2f3bfd8cdf8bf7c6867ee564275f15202d248067ba0f3628ef6"} Apr 22 19:58:26.280406 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.280389 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:26.287155 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.287128 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zh622" event={"ID":"b7a6e97e-64c6-44de-9b0f-622a7b3a2316","Type":"ContainerStarted","Data":"9092cda02912b45bfc36fa5189271f21a4bc86e1ff451d162a2bcc741c509fc9"} Apr 22 19:58:26.287573 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.287459 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zmn8h" podStartSLOduration=21.562108141 podStartE2EDuration="28.28744148s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.160029197 +0000 UTC m=+33.690751198" lastFinishedPulling="2026-04-22 
19:58:25.885362535 +0000 UTC m=+40.416084537" observedRunningTime="2026-04-22 19:58:26.286247634 +0000 UTC m=+40.816969645" watchObservedRunningTime="2026-04-22 19:58:26.28744148 +0000 UTC m=+40.818163490" Apr 22 19:58:26.289544 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.289520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" event={"ID":"10902c6d-77cd-4e80-86e7-28633566a0ee","Type":"ContainerStarted","Data":"eacd61fe1fcb177dca51aa1effd059387ea22c12b3d8f7921143d2639852dc19"} Apr 22 19:58:26.291466 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.290353 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:26.291771 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.291646 2578 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-lzbxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.8:8443/readyz\": dial tcp 10.132.0.8:8443: connect: connection refused" start-of-body= Apr 22 19:58:26.291771 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.291694 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.8:8443/readyz\": dial tcp 10.132.0.8:8443: connect: connection refused" Apr 22 19:58:26.291942 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.291840 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" event={"ID":"99eb7f40-e81e-4454-b333-f70327da668c","Type":"ContainerStarted","Data":"949f0160f9feae05cf45475df8ae4beacf256dc42525838ec54790a0b8487c5d"} Apr 22 19:58:26.303363 ip-10-0-135-72 
kubenswrapper[2578]: I0422 19:58:26.303304 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" podStartSLOduration=21.577972404 podStartE2EDuration="28.303271061s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.160049821 +0000 UTC m=+33.690771810" lastFinishedPulling="2026-04-22 19:58:25.88534848 +0000 UTC m=+40.416070467" observedRunningTime="2026-04-22 19:58:26.302790664 +0000 UTC m=+40.833512673" watchObservedRunningTime="2026-04-22 19:58:26.303271061 +0000 UTC m=+40.833993070" Apr 22 19:58:26.322141 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.322085 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-f7ckn" podStartSLOduration=21.597032132 podStartE2EDuration="28.322066198s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.160034454 +0000 UTC m=+33.690756452" lastFinishedPulling="2026-04-22 19:58:25.885068515 +0000 UTC m=+40.415790518" observedRunningTime="2026-04-22 19:58:26.319980989 +0000 UTC m=+40.850703027" watchObservedRunningTime="2026-04-22 19:58:26.322066198 +0000 UTC m=+40.852788210" Apr 22 19:58:26.346167 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.345853 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-nbb5m" podStartSLOduration=21.617264376 podStartE2EDuration="28.345833747s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.160021497 +0000 UTC m=+33.690743488" lastFinishedPulling="2026-04-22 19:58:25.888590853 +0000 UTC m=+40.419312859" observedRunningTime="2026-04-22 19:58:26.345138521 +0000 UTC m=+40.875860544" watchObservedRunningTime="2026-04-22 19:58:26.345833747 +0000 UTC m=+40.876555751" Apr 22 19:58:26.362144 ip-10-0-135-72 
kubenswrapper[2578]: I0422 19:58:26.360695 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rktp2" podStartSLOduration=34.480527795 podStartE2EDuration="40.360658084s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:58:20.254133017 +0000 UTC m=+34.784855019" lastFinishedPulling="2026-04-22 19:58:26.134263318 +0000 UTC m=+40.664985308" observedRunningTime="2026-04-22 19:58:26.358920058 +0000 UTC m=+40.889642076" watchObservedRunningTime="2026-04-22 19:58:26.360658084 +0000 UTC m=+40.891380074" Apr 22 19:58:26.375572 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.375016 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" podStartSLOduration=21.650769464 podStartE2EDuration="28.374997318s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.161332239 +0000 UTC m=+33.692054226" lastFinishedPulling="2026-04-22 19:58:25.885560092 +0000 UTC m=+40.416282080" observedRunningTime="2026-04-22 19:58:26.373951431 +0000 UTC m=+40.904673441" watchObservedRunningTime="2026-04-22 19:58:26.374997318 +0000 UTC m=+40.905719330" Apr 22 19:58:26.397830 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.397722 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podStartSLOduration=21.671771709 podStartE2EDuration="28.397701545s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.160303654 +0000 UTC m=+33.691025641" lastFinishedPulling="2026-04-22 19:58:25.88623349 +0000 UTC m=+40.416955477" observedRunningTime="2026-04-22 19:58:26.395772502 +0000 UTC m=+40.926494512" watchObservedRunningTime="2026-04-22 19:58:26.397701545 +0000 UTC m=+40.928423555" Apr 22 
19:58:26.418375 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:26.418309 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zh622" podStartSLOduration=9.820298738 podStartE2EDuration="40.418287182s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:57:48.586184031 +0000 UTC m=+3.116906035" lastFinishedPulling="2026-04-22 19:58:19.184172486 +0000 UTC m=+33.714894479" observedRunningTime="2026-04-22 19:58:26.415253152 +0000 UTC m=+40.945975163" watchObservedRunningTime="2026-04-22 19:58:26.418287182 +0000 UTC m=+40.949009183" Apr 22 19:58:27.296722 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.296637 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/0.log" Apr 22 19:58:27.296722 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.296687 2578 generic.go:358] "Generic (PLEG): container finished" podID="10902c6d-77cd-4e80-86e7-28633566a0ee" containerID="eacd61fe1fcb177dca51aa1effd059387ea22c12b3d8f7921143d2639852dc19" exitCode=255 Apr 22 19:58:27.297312 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.296835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" event={"ID":"10902c6d-77cd-4e80-86e7-28633566a0ee","Type":"ContainerDied","Data":"eacd61fe1fcb177dca51aa1effd059387ea22c12b3d8f7921143d2639852dc19"} Apr 22 19:58:27.297312 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.297174 2578 scope.go:117] "RemoveContainer" containerID="eacd61fe1fcb177dca51aa1effd059387ea22c12b3d8f7921143d2639852dc19" Apr 22 19:58:27.781491 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.781448 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"] Apr 22 19:58:27.806504 ip-10-0-135-72 kubenswrapper[2578]: 
I0422 19:58:27.806469 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"]
Apr 22 19:58:27.806671 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.806605 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"
Apr 22 19:58:27.809008 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.808980 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 19:58:27.809008 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.808995 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-xqkq9\""
Apr 22 19:58:27.809178 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.808994 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 19:58:27.962300 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:27.962256 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69mc\" (UniqueName: \"kubernetes.io/projected/e7f9c124-ab6b-4350-ae69-f352a89d7122-kube-api-access-z69mc\") pod \"migrator-74bb7799d9-9ggs9\" (UID: \"e7f9c124-ab6b-4350-ae69-f352a89d7122\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"
Apr 22 19:58:28.063871 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.063724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z69mc\" (UniqueName: \"kubernetes.io/projected/e7f9c124-ab6b-4350-ae69-f352a89d7122-kube-api-access-z69mc\") pod \"migrator-74bb7799d9-9ggs9\" (UID: \"e7f9c124-ab6b-4350-ae69-f352a89d7122\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"
Apr 22 19:58:28.077400 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.077363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69mc\" (UniqueName: \"kubernetes.io/projected/e7f9c124-ab6b-4350-ae69-f352a89d7122-kube-api-access-z69mc\") pod \"migrator-74bb7799d9-9ggs9\" (UID: \"e7f9c124-ab6b-4350-ae69-f352a89d7122\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"
Apr 22 19:58:28.117179 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.117139 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"
Apr 22 19:58:28.236674 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.236640 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9"]
Apr 22 19:58:28.239582 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:28.239550 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f9c124_ab6b_4350_ae69_f352a89d7122.slice/crio-d0ea1b42728354b6ad707816183625bcb2121e8ec42a95464a1ba7de35c23063 WatchSource:0}: Error finding container d0ea1b42728354b6ad707816183625bcb2121e8ec42a95464a1ba7de35c23063: Status 404 returned error can't find the container with id d0ea1b42728354b6ad707816183625bcb2121e8ec42a95464a1ba7de35c23063
Apr 22 19:58:28.301896 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.301862 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/1.log"
Apr 22 19:58:28.302335 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.302318 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/0.log"
Apr 22 19:58:28.302421 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.302356 2578 generic.go:358] "Generic (PLEG): container finished" podID="10902c6d-77cd-4e80-86e7-28633566a0ee" containerID="f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b" exitCode=255
Apr 22 19:58:28.302421 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.302387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" event={"ID":"10902c6d-77cd-4e80-86e7-28633566a0ee","Type":"ContainerDied","Data":"f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b"}
Apr 22 19:58:28.302520 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.302426 2578 scope.go:117] "RemoveContainer" containerID="eacd61fe1fcb177dca51aa1effd059387ea22c12b3d8f7921143d2639852dc19"
Apr 22 19:58:28.302724 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.302708 2578 scope.go:117] "RemoveContainer" containerID="f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b"
Apr 22 19:58:28.302977 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:28.302949 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee"
Apr 22 19:58:28.303675 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.303546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9" event={"ID":"e7f9c124-ab6b-4350-ae69-f352a89d7122","Type":"ContainerStarted","Data":"d0ea1b42728354b6ad707816183625bcb2121e8ec42a95464a1ba7de35c23063"}
Apr 22 19:58:28.366201 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.366148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx"
Apr 22 19:58:28.368654 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.368624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/88e758f9-14ca-4081-b67d-e9de91d6ddf6-original-pull-secret\") pod \"global-pull-secret-syncer-xpbrx\" (UID: \"88e758f9-14ca-4081-b67d-e9de91d6ddf6\") " pod="kube-system/global-pull-secret-syncer-xpbrx"
Apr 22 19:58:28.668438 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.668349 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xpbrx"
Apr 22 19:58:28.678257 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.678228 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:58:28.789414 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.789383 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xpbrx"]
Apr 22 19:58:28.792555 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:28.792526 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88e758f9_14ca_4081_b67d_e9de91d6ddf6.slice/crio-04f1e60faf0aa98d17a8fdfe7abc267553952094898f4dd37dd96739bca42ae6 WatchSource:0}: Error finding container 04f1e60faf0aa98d17a8fdfe7abc267553952094898f4dd37dd96739bca42ae6: Status 404 returned error can't find the container with id 04f1e60faf0aa98d17a8fdfe7abc267553952094898f4dd37dd96739bca42ae6
Apr 22 19:58:28.825287 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:28.825259 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzz8z_f4ad43cf-a292-44ff-a1ae-9d139860c9cc/dns-node-resolver/0.log"
Apr 22 19:58:29.308007 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:29.307974 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/1.log"
Apr 22 19:58:29.308456 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:29.308387 2578 scope.go:117] "RemoveContainer" containerID="f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b"
Apr 22 19:58:29.308629 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:29.308604 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee"
Apr 22 19:58:29.309378 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:29.309355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xpbrx" event={"ID":"88e758f9-14ca-4081-b67d-e9de91d6ddf6","Type":"ContainerStarted","Data":"04f1e60faf0aa98d17a8fdfe7abc267553952094898f4dd37dd96739bca42ae6"}
Apr 22 19:58:29.828757 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:29.828723 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pjssp_8501acc2-dabe-4f52-9b02-ba92e386acb7/node-ca/0.log"
Apr 22 19:58:30.312552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:30.312523 2578 scope.go:117] "RemoveContainer" containerID="f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b"
Apr 22 19:58:30.312978 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:30.312720 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee"
Apr 22 19:58:31.318015 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:31.317974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9" event={"ID":"e7f9c124-ab6b-4350-ae69-f352a89d7122","Type":"ContainerStarted","Data":"4e06abf045a6fbe8365f5d246e459e2cba2f26e07ca2c71977e9d5a377463724"}
Apr 22 19:58:31.318015 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:31.318017 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9" event={"ID":"e7f9c124-ab6b-4350-ae69-f352a89d7122","Type":"ContainerStarted","Data":"0683b4172930216d2d989eb06a3db4b59ee4cfe21f641a078b853eb97b1585d2"}
Apr 22 19:58:31.341181 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:31.341115 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9ggs9" podStartSLOduration=2.227773506 podStartE2EDuration="4.341095243s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 19:58:28.241519185 +0000 UTC m=+42.772241172" lastFinishedPulling="2026-04-22 19:58:30.354840918 +0000 UTC m=+44.885562909" observedRunningTime="2026-04-22 19:58:31.339844279 +0000 UTC m=+45.870566288" watchObservedRunningTime="2026-04-22 19:58:31.341095243 +0000 UTC m=+45.871817253"
Apr 22 19:58:33.327378 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:33.327292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xpbrx" event={"ID":"88e758f9-14ca-4081-b67d-e9de91d6ddf6","Type":"ContainerStarted","Data":"56bd78bdc4487a661ea0cc48428b840868f9e0a4b86f4d6a05fc3206742d29fd"}
Apr 22 19:58:33.343671 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:33.343623 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xpbrx" podStartSLOduration=33.280319586 podStartE2EDuration="37.343606933s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:58:28.794258449 +0000 UTC m=+43.324980436" lastFinishedPulling="2026-04-22 19:58:32.857545796 +0000 UTC m=+47.388267783" observedRunningTime="2026-04-22 19:58:33.34274503 +0000 UTC m=+47.873467082" watchObservedRunningTime="2026-04-22 19:58:33.343606933 +0000 UTC m=+47.874328991"
Apr 22 19:58:34.221445 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.221408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:58:34.221445 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.221462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:34.221717 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.221557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"
Apr 22 19:58:34.221717 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.221567 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:58:34.221717 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.221630 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:58:34.221717 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.221648 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9: secret "image-registry-tls" not found
Apr 22 19:58:34.221717 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.221635 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls podName:3b42faf8-dfc9-477e-a74b-abcef44beb8e nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.221620322 +0000 UTC m=+64.752342313 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls") pod "dns-default-rgmkt" (UID: "3b42faf8-dfc9-477e-a74b-abcef44beb8e") : secret "dns-default-metrics-tls" not found
Apr 22 19:58:34.221717 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.221634 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:58:34.221717 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.221713 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls podName:e9ed2f86-635a-4b6a-bb4c-a1b309537b91 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.221695613 +0000 UTC m=+64.752417607 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls") pod "image-registry-6bdfcbd6fd-q8mx9" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91") : secret "image-registry-tls" not found
Apr 22 19:58:34.222051 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.221742 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls podName:1ac0aac7-46c9-42f6-8aaa-e626360e1faa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.221732881 +0000 UTC m=+64.752454874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6vcjz" (UID: "1ac0aac7-46c9-42f6-8aaa-e626360e1faa") : secret "samples-operator-tls" not found
Apr 22 19:58:34.322878 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.322842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:34.322878 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.322878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.322903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.322922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn"
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:34.322957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323014 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.322996288 +0000 UTC m=+64.853718275 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : configmap references non-existent config key: service-ca.crt
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323012 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323012 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323058 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert podName:f0a675ce-7b40-4a85-8869-d492c9e0218d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.323051015 +0000 UTC m=+64.853773001 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-5lsr2" (UID: "f0a675ce-7b40-4a85-8869-d492c9e0218d") : secret "networking-console-plugin-cert" not found
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323070 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:58:34.323123 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323115 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:58:34.323447 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323124 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls podName:e05b401e-86d3-4f58-ba83-8727ba2b2682 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.323108775 +0000 UTC m=+64.853830762 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-fvkt4" (UID: "e05b401e-86d3-4f58-ba83-8727ba2b2682") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:58:34.323447 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323197 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs podName:cd91dcdf-472d-4758-b11b-7e7b6d347fbd nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.32317934 +0000 UTC m=+64.853901342 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs") pod "router-default-748d997cd4-p2hd6" (UID: "cd91dcdf-472d-4758-b11b-7e7b6d347fbd") : secret "router-metrics-certs-default" not found
Apr 22 19:58:34.323447 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:34.323221 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert podName:54257b88-b9cb-44e6-9885-78eb59be8c12 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.323211725 +0000 UTC m=+64.853933729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert") pod "ingress-canary-8zwhn" (UID: "54257b88-b9cb-44e6-9885-78eb59be8c12") : secret "canary-serving-cert" not found
Apr 22 19:58:36.290975 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:36.290895 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:58:36.291316 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:36.291270 2578 scope.go:117] "RemoveContainer" containerID="f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b"
Apr 22 19:58:36.291460 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:36.291444 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee"
Apr 22 19:58:44.227052 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:44.227022 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ncpz"
Apr 22 19:58:49.042857 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:49.042794 2578 scope.go:117] "RemoveContainer" containerID="f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b"
Apr 22 19:58:49.364484 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:49.364458 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log"
Apr 22 19:58:49.364842 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:49.364827 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/1.log"
Apr 22 19:58:49.364898 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:49.364860 2578 generic.go:358] "Generic (PLEG): container finished" podID="10902c6d-77cd-4e80-86e7-28633566a0ee" containerID="e003022a2cc7a18ef4dba514b7f0b04916db06ce89ab7df165380cbee2dfe4f0" exitCode=255
Apr 22 19:58:49.364898 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:49.364887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" event={"ID":"10902c6d-77cd-4e80-86e7-28633566a0ee","Type":"ContainerDied","Data":"e003022a2cc7a18ef4dba514b7f0b04916db06ce89ab7df165380cbee2dfe4f0"}
Apr 22 19:58:49.364994 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:49.364914 2578 scope.go:117] "RemoveContainer" containerID="f314260b40533f4ccdcbeace692d47e4eb9586f10f97ea457d459279bfd7750b"
Apr 22 19:58:49.365236 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:49.365218 2578 scope.go:117] "RemoveContainer" containerID="e003022a2cc7a18ef4dba514b7f0b04916db06ce89ab7df165380cbee2dfe4f0"
Apr 22 19:58:49.365434 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:49.365414 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee"
Apr 22 19:58:50.262169 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.262135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"
Apr 22 19:58:50.262535 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.262206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:58:50.262535 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.262227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:50.264818 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.264788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b42faf8-dfc9-477e-a74b-abcef44beb8e-metrics-tls\") pod \"dns-default-rgmkt\" (UID: \"3b42faf8-dfc9-477e-a74b-abcef44beb8e\") " pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:58:50.264895 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.264794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ac0aac7-46c9-42f6-8aaa-e626360e1faa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6vcjz\" (UID: \"1ac0aac7-46c9-42f6-8aaa-e626360e1faa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"
Apr 22 19:58:50.264895 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.264795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"image-registry-6bdfcbd6fd-q8mx9\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:50.363488 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.363454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:50.363488 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.363490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:50.363727 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.363507 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"
Apr 22 19:58:50.363727 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.363526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn"
Apr 22 19:58:50.363727 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.363555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:50.364204 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.364177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-service-ca-bundle\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:50.366306 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.366277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f0a675ce-7b40-4a85-8869-d492c9e0218d-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5lsr2\" (UID: \"f0a675ce-7b40-4a85-8869-d492c9e0218d\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"
Apr 22 19:58:50.366401 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.366284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54257b88-b9cb-44e6-9885-78eb59be8c12-cert\") pod \"ingress-canary-8zwhn\" (UID: \"54257b88-b9cb-44e6-9885-78eb59be8c12\") " pod="openshift-ingress-canary/ingress-canary-8zwhn"
Apr 22 19:58:50.366718 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.366694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd91dcdf-472d-4758-b11b-7e7b6d347fbd-metrics-certs\") pod \"router-default-748d997cd4-p2hd6\" (UID: \"cd91dcdf-472d-4758-b11b-7e7b6d347fbd\") " pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:50.366847 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.366754 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e05b401e-86d3-4f58-ba83-8727ba2b2682-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-fvkt4\" (UID: \"e05b401e-86d3-4f58-ba83-8727ba2b2682\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:50.369008 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.368990 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log"
Apr 22 19:58:50.451950 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.451920 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-md5kj\""
Apr 22 19:58:50.459944 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.459922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:58:50.460931 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.460912 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bbzb4\""
Apr 22 19:58:50.469959 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.469934 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"
Apr 22 19:58:50.488911 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.488890 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9r99g\""
Apr 22 19:58:50.496891 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.496859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:58:50.506917 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.506893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nfggw\""
Apr 22 19:58:50.514897 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.514871 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"
Apr 22 19:58:50.558136 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.557885 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-pwnl8\""
Apr 22 19:58:50.566321 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.566297 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-748d997cd4-p2hd6"
Apr 22 19:58:50.566909 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.566870 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-t4wtg\""
Apr 22 19:58:50.575163 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.575112 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"
Apr 22 19:58:50.579539 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.579286 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xt4d8\""
Apr 22 19:58:50.586673 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.585945 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8zwhn"
Apr 22 19:58:50.615990 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.615956 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"]
Apr 22 19:58:50.647429 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.647193 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz"]
Apr 22 19:58:50.709781 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.709736 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rgmkt"]
Apr 22 19:58:50.730683 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.730616 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4"]
Apr 22 19:58:50.808098 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:50.808061 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-748d997cd4-p2hd6"]
Apr 22 19:58:50.813255 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:50.813227 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd91dcdf_472d_4758_b11b_7e7b6d347fbd.slice/crio-1e3da9f832b3b60b30e5e5c6f92439295ac9d4187860c31a52f78b588e90ccf4 WatchSource:0}: Error finding container 1e3da9f832b3b60b30e5e5c6f92439295ac9d4187860c31a52f78b588e90ccf4: Status 404 returned error can't find the container with id 1e3da9f832b3b60b30e5e5c6f92439295ac9d4187860c31a52f78b588e90ccf4
Apr 22 19:58:51.018155 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.018072 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fxfbt"]
Apr 22 19:58:51.031996 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.031963 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fxfbt"
Apr 22 19:58:51.034517 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.034435 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 19:58:51.034719 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.034703 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h76wn\""
Apr 22 19:58:51.034791 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.034718 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 19:58:51.040962 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.040786 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8zwhn"]
Apr 22 19:58:51.045058 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.044981 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2"]
Apr 22
19:58:51.050690 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.050665 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fxfbt"] Apr 22 19:58:51.170507 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.170377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f7244af-0f0f-4ec7-9a81-fd42125846a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.170507 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.170424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f7244af-0f0f-4ec7-9a81-fd42125846a4-data-volume\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.170507 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.170442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f7244af-0f0f-4ec7-9a81-fd42125846a4-crio-socket\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.170507 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.170471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f7244af-0f0f-4ec7-9a81-fd42125846a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 
19:58:51.170901 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.170541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9ldk\" (UniqueName: \"kubernetes.io/projected/0f7244af-0f0f-4ec7-9a81-fd42125846a4-kube-api-access-p9ldk\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.274884 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.273176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9ldk\" (UniqueName: \"kubernetes.io/projected/0f7244af-0f0f-4ec7-9a81-fd42125846a4-kube-api-access-p9ldk\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.274884 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.273274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f7244af-0f0f-4ec7-9a81-fd42125846a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.274884 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.273318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f7244af-0f0f-4ec7-9a81-fd42125846a4-data-volume\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.274884 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.273343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/0f7244af-0f0f-4ec7-9a81-fd42125846a4-crio-socket\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.274884 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.273372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f7244af-0f0f-4ec7-9a81-fd42125846a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.274884 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.274381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f7244af-0f0f-4ec7-9a81-fd42125846a4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.275699 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.275627 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f7244af-0f0f-4ec7-9a81-fd42125846a4-crio-socket\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.276132 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.276073 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f7244af-0f0f-4ec7-9a81-fd42125846a4-data-volume\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.276551 ip-10-0-135-72 kubenswrapper[2578]: I0422 
19:58:51.276514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f7244af-0f0f-4ec7-9a81-fd42125846a4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.287709 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.287686 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9ldk\" (UniqueName: \"kubernetes.io/projected/0f7244af-0f0f-4ec7-9a81-fd42125846a4-kube-api-access-p9ldk\") pod \"insights-runtime-extractor-fxfbt\" (UID: \"0f7244af-0f0f-4ec7-9a81-fd42125846a4\") " pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.352766 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.352738 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fxfbt" Apr 22 19:58:51.376302 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.376240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" event={"ID":"f0a675ce-7b40-4a85-8869-d492c9e0218d","Type":"ContainerStarted","Data":"ff254e9a432656f9553e121712530485f37b1a3bf1c350093f59f19a7385d8c5"} Apr 22 19:58:51.379240 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.379179 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" event={"ID":"e9ed2f86-635a-4b6a-bb4c-a1b309537b91","Type":"ContainerStarted","Data":"d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba"} Apr 22 19:58:51.379240 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.379212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" 
event={"ID":"e9ed2f86-635a-4b6a-bb4c-a1b309537b91","Type":"ContainerStarted","Data":"1b13bc2701b255a6baee906b0df64c97f2940aca99e4e1013917c0f435b473a5"} Apr 22 19:58:51.379752 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.379725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:58:51.383198 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.383163 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" event={"ID":"e05b401e-86d3-4f58-ba83-8727ba2b2682","Type":"ContainerStarted","Data":"e6abb7fdff2c9a1c8cadf48c86fe2c16030d6ec98be0c84d4e843195040f29cf"} Apr 22 19:58:51.386940 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.386898 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rgmkt" event={"ID":"3b42faf8-dfc9-477e-a74b-abcef44beb8e","Type":"ContainerStarted","Data":"2d0e7c1ecde6a063ec954e8eed139d1c1e1b0121125115c5e11616d22434193a"} Apr 22 19:58:51.389276 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.389229 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" event={"ID":"1ac0aac7-46c9-42f6-8aaa-e626360e1faa","Type":"ContainerStarted","Data":"ee30ca91fdc65cde6b14caffee65994a362598d6dd341e3f355ad73a17b9763f"} Apr 22 19:58:51.390628 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.390578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8zwhn" event={"ID":"54257b88-b9cb-44e6-9885-78eb59be8c12","Type":"ContainerStarted","Data":"b704f7ee631751b5edfdea2e1b109f36577f375af0d13489d7acf114659f6a87"} Apr 22 19:58:51.392895 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.392348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-748d997cd4-p2hd6" 
event={"ID":"cd91dcdf-472d-4758-b11b-7e7b6d347fbd","Type":"ContainerStarted","Data":"c6a4300574a418c045f9ade12851aa7d0e2094df21c8d558b992540073e565a9"} Apr 22 19:58:51.392895 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.392374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-748d997cd4-p2hd6" event={"ID":"cd91dcdf-472d-4758-b11b-7e7b6d347fbd","Type":"ContainerStarted","Data":"1e3da9f832b3b60b30e5e5c6f92439295ac9d4187860c31a52f78b588e90ccf4"} Apr 22 19:58:51.400419 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.400197 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" podStartSLOduration=65.400182769 podStartE2EDuration="1m5.400182769s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:51.398390567 +0000 UTC m=+65.929112657" watchObservedRunningTime="2026-04-22 19:58:51.400182769 +0000 UTC m=+65.930904778" Apr 22 19:58:51.419116 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.419063 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-748d997cd4-p2hd6" podStartSLOduration=53.419044456 podStartE2EDuration="53.419044456s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:51.417750654 +0000 UTC m=+65.948472851" watchObservedRunningTime="2026-04-22 19:58:51.419044456 +0000 UTC m=+65.949766471" Apr 22 19:58:51.510961 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.510695 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fxfbt"] Apr 22 19:58:51.515631 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:51.515548 2578 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f7244af_0f0f_4ec7_9a81_fd42125846a4.slice/crio-3083a2b9bf24c877953888e084e671afc5815695afdeb57b45096c476644704d WatchSource:0}: Error finding container 3083a2b9bf24c877953888e084e671afc5815695afdeb57b45096c476644704d: Status 404 returned error can't find the container with id 3083a2b9bf24c877953888e084e671afc5815695afdeb57b45096c476644704d Apr 22 19:58:51.566911 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.566879 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:51.570021 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.569835 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:51.677857 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.677451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:51.686266 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.686245 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/802bd93c-03cf-435c-a223-487ff037f6c7-metrics-certs\") pod \"network-metrics-daemon-xbxhx\" (UID: \"802bd93c-03cf-435c-a223-487ff037f6c7\") " pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:51.783876 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.783843 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxcj4\"" Apr 22 19:58:51.791883 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.791857 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbxhx" Apr 22 19:58:51.949789 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:51.949753 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xbxhx"] Apr 22 19:58:51.956297 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:58:51.956236 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod802bd93c_03cf_435c_a223_487ff037f6c7.slice/crio-ada5b146deca2b6d2c2d06ab16cf53465020dfa8cdfad350b3440801856ecc64 WatchSource:0}: Error finding container ada5b146deca2b6d2c2d06ab16cf53465020dfa8cdfad350b3440801856ecc64: Status 404 returned error can't find the container with id ada5b146deca2b6d2c2d06ab16cf53465020dfa8cdfad350b3440801856ecc64 Apr 22 19:58:52.397404 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:52.397366 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xbxhx" event={"ID":"802bd93c-03cf-435c-a223-487ff037f6c7","Type":"ContainerStarted","Data":"ada5b146deca2b6d2c2d06ab16cf53465020dfa8cdfad350b3440801856ecc64"} Apr 22 19:58:52.399891 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:52.399856 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fxfbt" event={"ID":"0f7244af-0f0f-4ec7-9a81-fd42125846a4","Type":"ContainerStarted","Data":"10e9bcdb23a5f0a4c7bcfa2f4ec6d88ed7a5e4751f8de56788b7d82282ddd836"} Apr 22 19:58:52.399891 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:52.399895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fxfbt" event={"ID":"0f7244af-0f0f-4ec7-9a81-fd42125846a4","Type":"ContainerStarted","Data":"3083a2b9bf24c877953888e084e671afc5815695afdeb57b45096c476644704d"} Apr 22 19:58:52.400085 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:52.400069 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:52.401578 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:52.401549 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-748d997cd4-p2hd6" Apr 22 19:58:56.290597 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:56.290510 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:56.291079 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:56.290943 2578 scope.go:117] "RemoveContainer" containerID="e003022a2cc7a18ef4dba514b7f0b04916db06ce89ab7df165380cbee2dfe4f0" Apr 22 19:58:56.291134 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:56.291113 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee" Apr 22 19:58:57.420752 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.420714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rgmkt" event={"ID":"3b42faf8-dfc9-477e-a74b-abcef44beb8e","Type":"ContainerStarted","Data":"7c41ca97e61859628884e2316edcc5288cab95990ab217a986d0d7541fec0fdb"} Apr 22 19:58:57.421247 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.420760 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rgmkt" event={"ID":"3b42faf8-dfc9-477e-a74b-abcef44beb8e","Type":"ContainerStarted","Data":"1c7690015c389e3ad5ccd76afd32de31c0a2e3cf2d7fb74e3301c69c682c8e4b"} Apr 22 19:58:57.421247 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.420859 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rgmkt" Apr 22 19:58:57.422237 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.422202 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8zwhn" event={"ID":"54257b88-b9cb-44e6-9885-78eb59be8c12","Type":"ContainerStarted","Data":"2b192eb9ec9faa44d89616e636bebe9843d851abfe3e3ff62c59aaf10d806a9a"} Apr 22 19:58:57.423635 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.423561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" event={"ID":"f0a675ce-7b40-4a85-8869-d492c9e0218d","Type":"ContainerStarted","Data":"25384050a35d92f93adc86927f16e3a0f6109f39bd90688e569b20353f13b247"} Apr 22 19:58:57.425407 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.425383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xbxhx" event={"ID":"802bd93c-03cf-435c-a223-487ff037f6c7","Type":"ContainerStarted","Data":"5126daa0bef9ec2143b27bb05d2fdfc544311aee9506c4ca01c0f5b604c8a260"} Apr 22 19:58:57.425523 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.425423 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xbxhx" event={"ID":"802bd93c-03cf-435c-a223-487ff037f6c7","Type":"ContainerStarted","Data":"c136a7f7aee4178332e979f345311d22b0a15188809491877825e44f88507a7a"} Apr 22 19:58:57.426668 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.426642 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" event={"ID":"e05b401e-86d3-4f58-ba83-8727ba2b2682","Type":"ContainerStarted","Data":"049ea5ab8c651f55a8fd1c94d086c93faa81200aedd0dc45e51e4b3f6d23919d"} Apr 22 19:58:57.428234 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.428211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" event={"ID":"1ac0aac7-46c9-42f6-8aaa-e626360e1faa","Type":"ContainerStarted","Data":"db0826289e4bea5be429582144adfe9990bff0079f1b4c07adc90b2c3541ddda"} Apr 22 19:58:57.428322 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.428243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" event={"ID":"1ac0aac7-46c9-42f6-8aaa-e626360e1faa","Type":"ContainerStarted","Data":"b6ad102c6bd9dabab5a4246d8465359b206c1fcd9cdcd5b2dd658675fafc8e76"} Apr 22 19:58:57.429784 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.429764 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fxfbt" event={"ID":"0f7244af-0f0f-4ec7-9a81-fd42125846a4","Type":"ContainerStarted","Data":"39ac3200fb6321bfd1aa8fc721b290a1ed65d0502b785a6d6a8b1f9cc668f3ef"} Apr 22 19:58:57.441548 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.441479 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rgmkt" podStartSLOduration=33.808520903 podStartE2EDuration="39.44146786s" podCreationTimestamp="2026-04-22 19:58:18 +0000 UTC" firstStartedPulling="2026-04-22 19:58:50.705283676 +0000 UTC m=+65.236005669" lastFinishedPulling="2026-04-22 19:58:56.338230639 +0000 UTC m=+70.868952626" observedRunningTime="2026-04-22 19:58:57.440083424 +0000 UTC m=+71.970805435" watchObservedRunningTime="2026-04-22 19:58:57.44146786 +0000 UTC m=+71.972189868" Apr 22 19:58:57.493291 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.493246 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5lsr2" podStartSLOduration=47.231187475 podStartE2EDuration="52.493232317s" podCreationTimestamp="2026-04-22 19:58:05 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.052614114 +0000 UTC 
m=+65.583336113" lastFinishedPulling="2026-04-22 19:58:56.314658953 +0000 UTC m=+70.845380955" observedRunningTime="2026-04-22 19:58:57.491099955 +0000 UTC m=+72.021821964" watchObservedRunningTime="2026-04-22 19:58:57.493232317 +0000 UTC m=+72.023954326" Apr 22 19:58:57.493742 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.493707 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xbxhx" podStartSLOduration=67.084175509 podStartE2EDuration="1m11.493696256s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.9588463 +0000 UTC m=+66.489568293" lastFinishedPulling="2026-04-22 19:58:56.368367039 +0000 UTC m=+70.899089040" observedRunningTime="2026-04-22 19:58:57.464937889 +0000 UTC m=+71.995659902" watchObservedRunningTime="2026-04-22 19:58:57.493696256 +0000 UTC m=+72.024418266" Apr 22 19:58:57.510745 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.510691 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-fvkt4" podStartSLOduration=54.252470152 podStartE2EDuration="59.510678714s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:50.741236676 +0000 UTC m=+65.271958663" lastFinishedPulling="2026-04-22 19:58:55.999445235 +0000 UTC m=+70.530167225" observedRunningTime="2026-04-22 19:58:57.51058264 +0000 UTC m=+72.041304650" watchObservedRunningTime="2026-04-22 19:58:57.510678714 +0000 UTC m=+72.041400723" Apr 22 19:58:57.529468 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.529423 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6vcjz" podStartSLOduration=53.944227181 podStartE2EDuration="59.529410082s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="2026-04-22 19:58:50.753014691 +0000 UTC 
m=+65.283736693" lastFinishedPulling="2026-04-22 19:58:56.338197607 +0000 UTC m=+70.868919594" observedRunningTime="2026-04-22 19:58:57.527748454 +0000 UTC m=+72.058470473" watchObservedRunningTime="2026-04-22 19:58:57.529410082 +0000 UTC m=+72.060132091" Apr 22 19:58:57.557013 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:57.556966 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8zwhn" podStartSLOduration=34.608995741 podStartE2EDuration="39.556953807s" podCreationTimestamp="2026-04-22 19:58:18 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.051485996 +0000 UTC m=+65.582208000" lastFinishedPulling="2026-04-22 19:58:55.999444079 +0000 UTC m=+70.530166066" observedRunningTime="2026-04-22 19:58:57.555875016 +0000 UTC m=+72.086597017" watchObservedRunningTime="2026-04-22 19:58:57.556953807 +0000 UTC m=+72.087675813" Apr 22 19:58:58.306370 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:58.306342 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rktp2" Apr 22 19:58:58.678888 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:58.678858 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" Apr 22 19:58:58.679234 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:58.679217 2578 scope.go:117] "RemoveContainer" containerID="e003022a2cc7a18ef4dba514b7f0b04916db06ce89ab7df165380cbee2dfe4f0" Apr 22 19:58:58.679386 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:58:58.679370 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" 
podUID="10902c6d-77cd-4e80-86e7-28633566a0ee"
Apr 22 19:58:59.436935 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:59.436902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fxfbt" event={"ID":"0f7244af-0f0f-4ec7-9a81-fd42125846a4","Type":"ContainerStarted","Data":"b05876d74c1c2a850331043f226625d1b1df2292feae2b2df0f27ae9bf9c1073"}
Apr 22 19:58:59.455981 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:58:59.455941 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fxfbt" podStartSLOduration=2.738318641 podStartE2EDuration="9.455928486s" podCreationTimestamp="2026-04-22 19:58:50 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.678346896 +0000 UTC m=+66.209068883" lastFinishedPulling="2026-04-22 19:58:58.395956739 +0000 UTC m=+72.926678728" observedRunningTime="2026-04-22 19:58:59.454486169 +0000 UTC m=+73.985208191" watchObservedRunningTime="2026-04-22 19:58:59.455928486 +0000 UTC m=+73.986650495"
Apr 22 19:59:04.507734 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.507600 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8wksb"]
Apr 22 19:59:04.530435 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.530409 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.532940 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.532913 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 19:59:04.533053 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.532941 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-98m5q\""
Apr 22 19:59:04.533053 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.533039 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 19:59:04.533997 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.533976 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 19:59:04.534108 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.534038 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 19:59:04.680504 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49321b-5acd-4547-be8f-3070921da9ea-metrics-client-ca\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680679 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680679 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680548 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-root\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680679 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-sys\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680679 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-textfile\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680836 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgx25\" (UniqueName: \"kubernetes.io/projected/8f49321b-5acd-4547-be8f-3070921da9ea-kube-api-access-xgx25\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680836 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680836 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-wtmp\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.680927 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.680845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-tls\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.781873 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.781763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-tls\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.781873 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.781832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49321b-5acd-4547-be8f-3070921da9ea-metrics-client-ca\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.781873 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.781857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.781882 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-root\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.781915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-sys\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:59:04.781926 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.781983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-sys\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:59:04.781995 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-tls podName:8f49321b-5acd-4547-be8f-3070921da9ea nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.281977596 +0000 UTC m=+79.812699587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-tls") pod "node-exporter-8wksb" (UID: "8f49321b-5acd-4547-be8f-3070921da9ea") : secret "node-exporter-tls" not found
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.781995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-root\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-textfile\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgx25\" (UniqueName: \"kubernetes.io/projected/8f49321b-5acd-4547-be8f-3070921da9ea-kube-api-access-xgx25\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782145 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782539 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-wtmp\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782539 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-wtmp\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782539 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-textfile\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782539 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f49321b-5acd-4547-be8f-3070921da9ea-metrics-client-ca\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.782676 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.782632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.784473 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.784434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:04.806600 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:04.806569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgx25\" (UniqueName: \"kubernetes.io/projected/8f49321b-5acd-4547-be8f-3070921da9ea-kube-api-access-xgx25\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:05.287633 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:05.287593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-tls\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:05.290034 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:05.290009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8f49321b-5acd-4547-be8f-3070921da9ea-node-exporter-tls\") pod \"node-exporter-8wksb\" (UID: \"8f49321b-5acd-4547-be8f-3070921da9ea\") " pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:05.440697 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:05.440658 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8wksb"
Apr 22 19:59:05.450117 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:59:05.450082 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f49321b_5acd_4547_be8f_3070921da9ea.slice/crio-9c1c7dcac1adc5d11f4d13662f2b136dfeb89e12afbe7aba304bae7ca59bdf87 WatchSource:0}: Error finding container 9c1c7dcac1adc5d11f4d13662f2b136dfeb89e12afbe7aba304bae7ca59bdf87: Status 404 returned error can't find the container with id 9c1c7dcac1adc5d11f4d13662f2b136dfeb89e12afbe7aba304bae7ca59bdf87
Apr 22 19:59:06.456714 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:06.456620 2578 generic.go:358] "Generic (PLEG): container finished" podID="8f49321b-5acd-4547-be8f-3070921da9ea" containerID="8d1a7f96c56c7d60e2f9fcc6851cf8393e3bf5874c99393212b26f6ba52ce35d" exitCode=0
Apr 22 19:59:06.457237 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:06.456708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8wksb" event={"ID":"8f49321b-5acd-4547-be8f-3070921da9ea","Type":"ContainerDied","Data":"8d1a7f96c56c7d60e2f9fcc6851cf8393e3bf5874c99393212b26f6ba52ce35d"}
Apr 22 19:59:06.457237 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:06.456755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8wksb" event={"ID":"8f49321b-5acd-4547-be8f-3070921da9ea","Type":"ContainerStarted","Data":"9c1c7dcac1adc5d11f4d13662f2b136dfeb89e12afbe7aba304bae7ca59bdf87"}
Apr 22 19:59:07.434456 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:07.434428 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rgmkt"
Apr 22 19:59:07.462151 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:07.462122 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8wksb" event={"ID":"8f49321b-5acd-4547-be8f-3070921da9ea","Type":"ContainerStarted","Data":"2c5b22537f1d994000e6c766cf4e739e1c394114a616ee46d550d66535eee625"}
Apr 22 19:59:07.462151 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:07.462154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8wksb" event={"ID":"8f49321b-5acd-4547-be8f-3070921da9ea","Type":"ContainerStarted","Data":"dfb00b2cf11c841f43cfbd401cef0ac0246b2394559e342e11a699bf7b9c34f4"}
Apr 22 19:59:07.480924 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:07.480527 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8wksb" podStartSLOduration=2.767759605 podStartE2EDuration="3.480509616s" podCreationTimestamp="2026-04-22 19:59:04 +0000 UTC" firstStartedPulling="2026-04-22 19:59:05.451917397 +0000 UTC m=+79.982639384" lastFinishedPulling="2026-04-22 19:59:06.164667407 +0000 UTC m=+80.695389395" observedRunningTime="2026-04-22 19:59:07.47962463 +0000 UTC m=+82.010346652" watchObservedRunningTime="2026-04-22 19:59:07.480509616 +0000 UTC m=+82.011231626"
Apr 22 19:59:09.042757 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:09.042721 2578 scope.go:117] "RemoveContainer" containerID="e003022a2cc7a18ef4dba514b7f0b04916db06ce89ab7df165380cbee2dfe4f0"
Apr 22 19:59:09.043166 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:59:09.042915 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-lzbxz_openshift-console-operator(10902c6d-77cd-4e80-86e7-28633566a0ee)\"" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" podUID="10902c6d-77cd-4e80-86e7-28633566a0ee"
Apr 22 19:59:12.403662 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:12.403626 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:59:14.647006 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:14.646968 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"]
Apr 22 19:59:24.042007 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:24.041978 2578 scope.go:117] "RemoveContainer" containerID="e003022a2cc7a18ef4dba514b7f0b04916db06ce89ab7df165380cbee2dfe4f0"
Apr 22 19:59:24.512582 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:24.512549 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log"
Apr 22 19:59:24.512734 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:24.512647 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz" event={"ID":"10902c6d-77cd-4e80-86e7-28633566a0ee","Type":"ContainerStarted","Data":"452f0b24396e3e76861bf30b2515a423dbc4206fbe9e29341de436ae0038fb75"}
Apr 22 19:59:24.513025 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:24.512991 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:59:24.982552 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:24.982524 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-lzbxz"
Apr 22 19:59:25.166689 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.166659 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-b4vgs"]
Apr 22 19:59:25.170022 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.169999 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b4vgs"
Apr 22 19:59:25.172174 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.172147 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 19:59:25.172475 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.172458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-b5sdg\""
Apr 22 19:59:25.172577 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.172474 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 19:59:25.181726 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.181694 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b4vgs"]
Apr 22 19:59:25.240526 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.240447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxh58\" (UniqueName: \"kubernetes.io/projected/fb5e4789-8a1a-445c-aeaf-e55b1a760fb5-kube-api-access-zxh58\") pod \"downloads-6bcc868b7-b4vgs\" (UID: \"fb5e4789-8a1a-445c-aeaf-e55b1a760fb5\") " pod="openshift-console/downloads-6bcc868b7-b4vgs"
Apr 22 19:59:25.341773 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.341737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxh58\" (UniqueName: \"kubernetes.io/projected/fb5e4789-8a1a-445c-aeaf-e55b1a760fb5-kube-api-access-zxh58\") pod \"downloads-6bcc868b7-b4vgs\" (UID: \"fb5e4789-8a1a-445c-aeaf-e55b1a760fb5\") " pod="openshift-console/downloads-6bcc868b7-b4vgs"
Apr 22 19:59:25.353172 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.353144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxh58\" (UniqueName: \"kubernetes.io/projected/fb5e4789-8a1a-445c-aeaf-e55b1a760fb5-kube-api-access-zxh58\") pod \"downloads-6bcc868b7-b4vgs\" (UID: \"fb5e4789-8a1a-445c-aeaf-e55b1a760fb5\") " pod="openshift-console/downloads-6bcc868b7-b4vgs"
Apr 22 19:59:25.479538 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.479503 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-b4vgs"
Apr 22 19:59:25.604626 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:25.604596 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-b4vgs"]
Apr 22 19:59:25.607961 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:59:25.607926 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5e4789_8a1a_445c_aeaf_e55b1a760fb5.slice/crio-b86f636795e2826a3d1459e0ff903b18e1a3381aae51ce6852218baf0238a91d WatchSource:0}: Error finding container b86f636795e2826a3d1459e0ff903b18e1a3381aae51ce6852218baf0238a91d: Status 404 returned error can't find the container with id b86f636795e2826a3d1459e0ff903b18e1a3381aae51ce6852218baf0238a91d
Apr 22 19:59:26.519688 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:26.519648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b4vgs" event={"ID":"fb5e4789-8a1a-445c-aeaf-e55b1a760fb5","Type":"ContainerStarted","Data":"b86f636795e2826a3d1459e0ff903b18e1a3381aae51ce6852218baf0238a91d"}
Apr 22 19:59:32.538293 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:32.538193 2578 generic.go:358] "Generic (PLEG): container finished" podID="c79df87a-c734-4d84-b239-5cfbd4266788" containerID="c60e9425acd680c3e1a28d4bfb8fa68d19c071fc1938d13fb9ba03f129c6f0ab" exitCode=0
Apr 22 19:59:32.538293 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:32.538262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nbb5m" event={"ID":"c79df87a-c734-4d84-b239-5cfbd4266788","Type":"ContainerDied","Data":"c60e9425acd680c3e1a28d4bfb8fa68d19c071fc1938d13fb9ba03f129c6f0ab"}
Apr 22 19:59:32.538914 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:32.538667 2578 scope.go:117] "RemoveContainer" containerID="c60e9425acd680c3e1a28d4bfb8fa68d19c071fc1938d13fb9ba03f129c6f0ab"
Apr 22 19:59:33.527169 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:33.527137 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-748d997cd4-p2hd6_cd91dcdf-472d-4758-b11b-7e7b6d347fbd/router/0.log"
Apr 22 19:59:33.533688 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:33.533629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8zwhn_54257b88-b9cb-44e6-9885-78eb59be8c12/serve-healthcheck-canary/0.log"
Apr 22 19:59:33.544233 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:33.544198 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nbb5m" event={"ID":"c79df87a-c734-4d84-b239-5cfbd4266788","Type":"ContainerStarted","Data":"56a4970ecd2177735003199508d0989c8e13ca336c6005345602b20d71d249b4"}
Apr 22 19:59:35.234680 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.234637 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b4d5bbc74-5dnn7"]
Apr 22 19:59:35.238372 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.238346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.243085 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.243059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 19:59:35.243223 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.243136 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 19:59:35.243223 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.243065 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 19:59:35.243223 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.243065 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 19:59:35.243385 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.243240 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 19:59:35.243385 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.243292 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nvt62\""
Apr 22 19:59:35.249163 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.247315 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4d5bbc74-5dnn7"]
Apr 22 19:59:35.335487 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.335453 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srqn\" (UniqueName: \"kubernetes.io/projected/b1161772-3316-4843-8f3d-908f1b869176-kube-api-access-2srqn\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.335669 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.335498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-service-ca\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.335669 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.335521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-oauth-serving-cert\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.335669 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.335640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-console-config\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.335836 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.335682 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-serving-cert\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.335836 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.335730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-oauth-config\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.436657 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.436614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-oauth-config\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.436657 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.436662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2srqn\" (UniqueName: \"kubernetes.io/projected/b1161772-3316-4843-8f3d-908f1b869176-kube-api-access-2srqn\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.436915 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.436693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-service-ca\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.436972 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.436913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-oauth-serving-cert\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.437027 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.437002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-console-config\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.437079 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.437043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-serving-cert\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.437652 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.437628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-console-config\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.437760 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.437656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-service-ca\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.438019 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.437992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-oauth-serving-cert\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.439578 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.439551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-oauth-config\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.439671 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.439620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-serving-cert\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.444608 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.444587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srqn\" (UniqueName: \"kubernetes.io/projected/b1161772-3316-4843-8f3d-908f1b869176-kube-api-access-2srqn\") pod \"console-7b4d5bbc74-5dnn7\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:35.550418 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:35.550328 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4d5bbc74-5dnn7"
Apr 22 19:59:39.668956 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:39.668908 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" podUID="e9ed2f86-635a-4b6a-bb4c-a1b309537b91" containerName="registry" containerID="cri-o://d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba" gracePeriod=30
Apr 22 19:59:41.384614 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.384578 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4d5bbc74-5dnn7"]
Apr 22 19:59:41.388675 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:59:41.388634 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1161772_3316_4843_8f3d_908f1b869176.slice/crio-7a8231876d6e39c3e488e177a2c20eca7216329d23410d09e63461a4efec9f3d WatchSource:0}: Error finding container 7a8231876d6e39c3e488e177a2c20eca7216329d23410d09e63461a4efec9f3d: Status 404 returned error can't find the container with id 7a8231876d6e39c3e488e177a2c20eca7216329d23410d09e63461a4efec9f3d
Apr 22 19:59:41.405691 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.405666 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"
Apr 22 19:59:41.491597 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491500 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-certificates\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") "
Apr 22 19:59:41.491597 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491595 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-trusted-ca\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") "
Apr 22 19:59:41.491853 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491627 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") "
Apr 22 19:59:41.491853 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491657 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-image-registry-private-configuration\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") "
Apr 22 19:59:41.491853 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491687 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-ca-trust-extracted\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") "
Apr 22 19:59:41.491853 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491733 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-476gp\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-kube-api-access-476gp\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " Apr 22 19:59:41.491853 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491767 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-bound-sa-token\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " Apr 22 19:59:41.491853 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.491790 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-installation-pull-secrets\") pod \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\" (UID: \"e9ed2f86-635a-4b6a-bb4c-a1b309537b91\") " Apr 22 19:59:41.492216 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.492020 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:41.492216 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.492007 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:41.494642 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.494605 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:41.494642 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.494638 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:41.494853 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.494615 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-kube-api-access-476gp" (OuterVolumeSpecName: "kube-api-access-476gp") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). InnerVolumeSpecName "kube-api-access-476gp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:41.494908 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.494890 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:41.494943 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.494920 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:41.502925 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.502769 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e9ed2f86-635a-4b6a-bb4c-a1b309537b91" (UID: "e9ed2f86-635a-4b6a-bb4c-a1b309537b91"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:59:41.571210 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.571175 2578 generic.go:358] "Generic (PLEG): container finished" podID="e9ed2f86-635a-4b6a-bb4c-a1b309537b91" containerID="d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba" exitCode=0 Apr 22 19:59:41.571379 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.571247 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" Apr 22 19:59:41.571379 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.571265 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" event={"ID":"e9ed2f86-635a-4b6a-bb4c-a1b309537b91","Type":"ContainerDied","Data":"d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba"} Apr 22 19:59:41.571379 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.571310 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9" event={"ID":"e9ed2f86-635a-4b6a-bb4c-a1b309537b91","Type":"ContainerDied","Data":"1b13bc2701b255a6baee906b0df64c97f2940aca99e4e1013917c0f435b473a5"} Apr 22 19:59:41.571379 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.571330 2578 scope.go:117] "RemoveContainer" containerID="d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba" Apr 22 19:59:41.572908 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.572837 2578 generic.go:358] "Generic (PLEG): container finished" podID="b589acde-4c45-4c44-be29-d3785bec1ccd" containerID="56582f7b1171be02c38bdb2f860fbe56279dcf33737b2768653184d88212df94" exitCode=0 Apr 22 19:59:41.573023 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.572936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" event={"ID":"b589acde-4c45-4c44-be29-d3785bec1ccd","Type":"ContainerDied","Data":"56582f7b1171be02c38bdb2f860fbe56279dcf33737b2768653184d88212df94"} Apr 22 19:59:41.573298 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.573276 2578 scope.go:117] "RemoveContainer" containerID="56582f7b1171be02c38bdb2f860fbe56279dcf33737b2768653184d88212df94" Apr 22 19:59:41.575626 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.575602 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-b4vgs" 
event={"ID":"fb5e4789-8a1a-445c-aeaf-e55b1a760fb5","Type":"ContainerStarted","Data":"22f2d6c41c3e6277c3db9f04f643a439b819309377bf4f1965e838c3cce24c4d"} Apr 22 19:59:41.576161 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.576107 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-b4vgs" Apr 22 19:59:41.577055 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.577021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4d5bbc74-5dnn7" event={"ID":"b1161772-3316-4843-8f3d-908f1b869176","Type":"ContainerStarted","Data":"7a8231876d6e39c3e488e177a2c20eca7216329d23410d09e63461a4efec9f3d"} Apr 22 19:59:41.577635 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.577611 2578 patch_prober.go:28] interesting pod/downloads-6bcc868b7-b4vgs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused" start-of-body= Apr 22 19:59:41.577734 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.577660 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-b4vgs" podUID="fb5e4789-8a1a-445c-aeaf-e55b1a760fb5" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 19:59:41.585220 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.585193 2578 scope.go:117] "RemoveContainer" containerID="d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba" Apr 22 19:59:41.585626 ip-10-0-135-72 kubenswrapper[2578]: E0422 19:59:41.585593 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba\": container with ID starting with 
d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba not found: ID does not exist" containerID="d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba" Apr 22 19:59:41.585742 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.585634 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba"} err="failed to get container status \"d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba\": rpc error: code = NotFound desc = could not find container \"d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba\": container with ID starting with d1ddd23b2b51a99d53a7077e91e2ec57b21be2c1a61883170832708dbd2c75ba not found: ID does not exist" Apr 22 19:59:41.593088 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.592920 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-certificates\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.593088 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.593050 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-trusted-ca\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.593088 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.593088 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-registry-tls\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.593286 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.593105 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-image-registry-private-configuration\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.593286 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.593118 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-ca-trust-extracted\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.593286 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.593127 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-476gp\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-kube-api-access-476gp\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.593286 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.593136 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-bound-sa-token\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.593286 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.593145 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e9ed2f86-635a-4b6a-bb4c-a1b309537b91-installation-pull-secrets\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 19:59:41.606052 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.605675 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"] Apr 22 19:59:41.612102 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.612027 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6bdfcbd6fd-q8mx9"] Apr 22 19:59:41.631152 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:41.631095 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-6bcc868b7-b4vgs" podStartSLOduration=0.91804891 podStartE2EDuration="16.631074071s" podCreationTimestamp="2026-04-22 19:59:25 +0000 UTC" firstStartedPulling="2026-04-22 19:59:25.609867749 +0000 UTC m=+100.140589750" lastFinishedPulling="2026-04-22 19:59:41.322892921 +0000 UTC m=+115.853614911" observedRunningTime="2026-04-22 19:59:41.630473578 +0000 UTC m=+116.161195588" watchObservedRunningTime="2026-04-22 19:59:41.631074071 +0000 UTC m=+116.161796082" Apr 22 19:59:42.048620 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:42.048134 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ed2f86-635a-4b6a-bb4c-a1b309537b91" path="/var/lib/kubelet/pods/e9ed2f86-635a-4b6a-bb4c-a1b309537b91/volumes" Apr 22 19:59:42.583679 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:42.583631 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lgx98" event={"ID":"b589acde-4c45-4c44-be29-d3785bec1ccd","Type":"ContainerStarted","Data":"bdc61ebed22db741383efa00182563083154902536205ba95a5dd18ef749f514"} Apr 22 19:59:42.594297 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:42.594265 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-b4vgs" Apr 22 19:59:44.073827 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.073769 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56d957bcf7-rl7nv"] Apr 22 19:59:44.074258 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.074236 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9ed2f86-635a-4b6a-bb4c-a1b309537b91" containerName="registry" Apr 22 19:59:44.074431 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.074413 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ed2f86-635a-4b6a-bb4c-a1b309537b91" containerName="registry" Apr 22 19:59:44.074551 ip-10-0-135-72 kubenswrapper[2578]: 
I0422 19:59:44.074536 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9ed2f86-635a-4b6a-bb4c-a1b309537b91" containerName="registry" Apr 22 19:59:44.092598 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.092548 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d957bcf7-rl7nv"] Apr 22 19:59:44.092798 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.092693 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.100722 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.100689 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:59:44.219555 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.219180 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-service-ca\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.219555 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.219236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-oauth-serving-cert\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.219555 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.219265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vhpq\" (UniqueName: \"kubernetes.io/projected/aa998dbf-0f45-48ce-b059-429bc1dd3da9-kube-api-access-8vhpq\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " 
pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.219555 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.219302 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-oauth-config\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.219555 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.219355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-serving-cert\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.219555 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.219407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-trusted-ca-bundle\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.219555 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.219440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-config\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.320243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-oauth-serving-cert\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.320292 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vhpq\" (UniqueName: \"kubernetes.io/projected/aa998dbf-0f45-48ce-b059-429bc1dd3da9-kube-api-access-8vhpq\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.320335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-oauth-config\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.320388 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-serving-cert\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.320444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-trusted-ca-bundle\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.320477 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-config\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.320513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-service-ca\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.321147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-oauth-serving-cert\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321313 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.321266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-service-ca\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.321999 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.321978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-config\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.322716 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.322692 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-trusted-ca-bundle\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.323986 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.323930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-oauth-config\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.325398 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.325375 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-serving-cert\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.331294 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.331266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vhpq\" (UniqueName: \"kubernetes.io/projected/aa998dbf-0f45-48ce-b059-429bc1dd3da9-kube-api-access-8vhpq\") pod \"console-56d957bcf7-rl7nv\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.407456 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.407414 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:44.832979 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:44.832901 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56d957bcf7-rl7nv"] Apr 22 19:59:44.849216 ip-10-0-135-72 kubenswrapper[2578]: W0422 19:59:44.849100 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa998dbf_0f45_48ce_b059_429bc1dd3da9.slice/crio-6d78173472a2dddd2cfba1611ad6d7454e8d4b2e307da1ae6976abf4b81363fd WatchSource:0}: Error finding container 6d78173472a2dddd2cfba1611ad6d7454e8d4b2e307da1ae6976abf4b81363fd: Status 404 returned error can't find the container with id 6d78173472a2dddd2cfba1611ad6d7454e8d4b2e307da1ae6976abf4b81363fd Apr 22 19:59:45.600383 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:45.600347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4d5bbc74-5dnn7" event={"ID":"b1161772-3316-4843-8f3d-908f1b869176","Type":"ContainerStarted","Data":"057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553"} Apr 22 19:59:45.602941 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:45.602906 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d957bcf7-rl7nv" event={"ID":"aa998dbf-0f45-48ce-b059-429bc1dd3da9","Type":"ContainerStarted","Data":"e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb"} Apr 22 19:59:45.603146 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:45.603127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d957bcf7-rl7nv" event={"ID":"aa998dbf-0f45-48ce-b059-429bc1dd3da9","Type":"ContainerStarted","Data":"6d78173472a2dddd2cfba1611ad6d7454e8d4b2e307da1ae6976abf4b81363fd"} Apr 22 19:59:45.618163 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:45.618084 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-7b4d5bbc74-5dnn7" podStartSLOduration=6.875651011 podStartE2EDuration="10.618064226s" podCreationTimestamp="2026-04-22 19:59:35 +0000 UTC" firstStartedPulling="2026-04-22 19:59:41.391082256 +0000 UTC m=+115.921804244" lastFinishedPulling="2026-04-22 19:59:45.133495472 +0000 UTC m=+119.664217459" observedRunningTime="2026-04-22 19:59:45.615797168 +0000 UTC m=+120.146519177" watchObservedRunningTime="2026-04-22 19:59:45.618064226 +0000 UTC m=+120.148786238" Apr 22 19:59:45.632483 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:45.632428 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56d957bcf7-rl7nv" podStartSLOduration=1.028627505 podStartE2EDuration="1.632410762s" podCreationTimestamp="2026-04-22 19:59:44 +0000 UTC" firstStartedPulling="2026-04-22 19:59:44.852194832 +0000 UTC m=+119.382916821" lastFinishedPulling="2026-04-22 19:59:45.455978087 +0000 UTC m=+119.986700078" observedRunningTime="2026-04-22 19:59:45.630431898 +0000 UTC m=+120.161153907" watchObservedRunningTime="2026-04-22 19:59:45.632410762 +0000 UTC m=+120.163132771" Apr 22 19:59:52.628507 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:52.628457 2578 generic.go:358] "Generic (PLEG): container finished" podID="99eb7f40-e81e-4454-b333-f70327da668c" containerID="949f0160f9feae05cf45475df8ae4beacf256dc42525838ec54790a0b8487c5d" exitCode=0 Apr 22 19:59:52.629014 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:52.628515 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" event={"ID":"99eb7f40-e81e-4454-b333-f70327da668c","Type":"ContainerDied","Data":"949f0160f9feae05cf45475df8ae4beacf256dc42525838ec54790a0b8487c5d"} Apr 22 19:59:52.629014 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:52.628868 2578 scope.go:117] "RemoveContainer" 
containerID="949f0160f9feae05cf45475df8ae4beacf256dc42525838ec54790a0b8487c5d" Apr 22 19:59:53.633300 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:53.633264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hdqk8" event={"ID":"99eb7f40-e81e-4454-b333-f70327da668c","Type":"ContainerStarted","Data":"4f719fba27df3217d57767f34e8d709cb1076b887cdaef14a72176bb3a4b1311"} Apr 22 19:59:54.407635 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:54.407596 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:54.407635 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:54.407647 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:54.412392 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:54.412369 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:54.640334 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:54.640295 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 19:59:54.685843 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:54.685753 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b4d5bbc74-5dnn7"] Apr 22 19:59:55.551307 ip-10-0-135-72 kubenswrapper[2578]: I0422 19:59:55.551270 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b4d5bbc74-5dnn7" Apr 22 20:00:19.712589 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:19.712530 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b4d5bbc74-5dnn7" podUID="b1161772-3316-4843-8f3d-908f1b869176" containerName="console" 
containerID="cri-o://057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553" gracePeriod=15 Apr 22 20:00:20.025694 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.025664 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b4d5bbc74-5dnn7_b1161772-3316-4843-8f3d-908f1b869176/console/0.log" Apr 22 20:00:20.025853 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.025726 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4d5bbc74-5dnn7" Apr 22 20:00:20.198920 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.198883 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2srqn\" (UniqueName: \"kubernetes.io/projected/b1161772-3316-4843-8f3d-908f1b869176-kube-api-access-2srqn\") pod \"b1161772-3316-4843-8f3d-908f1b869176\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " Apr 22 20:00:20.198920 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.198923 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-oauth-serving-cert\") pod \"b1161772-3316-4843-8f3d-908f1b869176\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " Apr 22 20:00:20.199154 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.198938 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-console-config\") pod \"b1161772-3316-4843-8f3d-908f1b869176\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " Apr 22 20:00:20.199154 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.198978 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-service-ca\") pod 
\"b1161772-3316-4843-8f3d-908f1b869176\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " Apr 22 20:00:20.199154 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.199077 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-serving-cert\") pod \"b1161772-3316-4843-8f3d-908f1b869176\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " Apr 22 20:00:20.199154 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.199150 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-oauth-config\") pod \"b1161772-3316-4843-8f3d-908f1b869176\" (UID: \"b1161772-3316-4843-8f3d-908f1b869176\") " Apr 22 20:00:20.199395 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.199359 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b1161772-3316-4843-8f3d-908f1b869176" (UID: "b1161772-3316-4843-8f3d-908f1b869176"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:20.199395 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.199382 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-console-config" (OuterVolumeSpecName: "console-config") pod "b1161772-3316-4843-8f3d-908f1b869176" (UID: "b1161772-3316-4843-8f3d-908f1b869176"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:20.199395 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.199388 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-service-ca" (OuterVolumeSpecName: "service-ca") pod "b1161772-3316-4843-8f3d-908f1b869176" (UID: "b1161772-3316-4843-8f3d-908f1b869176"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:20.201512 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.201483 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b1161772-3316-4843-8f3d-908f1b869176" (UID: "b1161772-3316-4843-8f3d-908f1b869176"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:20.201512 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.201483 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1161772-3316-4843-8f3d-908f1b869176-kube-api-access-2srqn" (OuterVolumeSpecName: "kube-api-access-2srqn") pod "b1161772-3316-4843-8f3d-908f1b869176" (UID: "b1161772-3316-4843-8f3d-908f1b869176"). InnerVolumeSpecName "kube-api-access-2srqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:00:20.201633 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.201515 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b1161772-3316-4843-8f3d-908f1b869176" (UID: "b1161772-3316-4843-8f3d-908f1b869176"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:20.300305 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.300226 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-oauth-config\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:00:20.300305 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.300253 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2srqn\" (UniqueName: \"kubernetes.io/projected/b1161772-3316-4843-8f3d-908f1b869176-kube-api-access-2srqn\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:00:20.300305 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.300262 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-oauth-serving-cert\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:00:20.300305 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.300271 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-console-config\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:00:20.300305 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.300280 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1161772-3316-4843-8f3d-908f1b869176-service-ca\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:00:20.300305 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.300288 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1161772-3316-4843-8f3d-908f1b869176-console-serving-cert\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:00:20.714086 ip-10-0-135-72 
kubenswrapper[2578]: I0422 20:00:20.714056 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b4d5bbc74-5dnn7_b1161772-3316-4843-8f3d-908f1b869176/console/0.log" Apr 22 20:00:20.714485 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.714099 2578 generic.go:358] "Generic (PLEG): container finished" podID="b1161772-3316-4843-8f3d-908f1b869176" containerID="057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553" exitCode=2 Apr 22 20:00:20.714485 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.714176 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4d5bbc74-5dnn7" Apr 22 20:00:20.714485 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.714174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4d5bbc74-5dnn7" event={"ID":"b1161772-3316-4843-8f3d-908f1b869176","Type":"ContainerDied","Data":"057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553"} Apr 22 20:00:20.714485 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.714285 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4d5bbc74-5dnn7" event={"ID":"b1161772-3316-4843-8f3d-908f1b869176","Type":"ContainerDied","Data":"7a8231876d6e39c3e488e177a2c20eca7216329d23410d09e63461a4efec9f3d"} Apr 22 20:00:20.714485 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.714304 2578 scope.go:117] "RemoveContainer" containerID="057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553" Apr 22 20:00:20.723568 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.723545 2578 scope.go:117] "RemoveContainer" containerID="057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553" Apr 22 20:00:20.723894 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:00:20.723872 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553\": container with ID starting with 057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553 not found: ID does not exist" containerID="057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553" Apr 22 20:00:20.723988 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.723899 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553"} err="failed to get container status \"057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553\": rpc error: code = NotFound desc = could not find container \"057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553\": container with ID starting with 057102b560f3ab96bd868982ab38f8fd9214900857d6717ad5c26efe45214553 not found: ID does not exist" Apr 22 20:00:20.734624 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.734595 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b4d5bbc74-5dnn7"] Apr 22 20:00:20.738511 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:20.738486 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b4d5bbc74-5dnn7"] Apr 22 20:00:22.050662 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:22.050614 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1161772-3316-4843-8f3d-908f1b869176" path="/var/lib/kubelet/pods/b1161772-3316-4843-8f3d-908f1b869176/volumes" Apr 22 20:00:25.166961 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.166920 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-fc7856f6d-84ncp"] Apr 22 20:00:25.167516 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.167335 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1161772-3316-4843-8f3d-908f1b869176" containerName="console" Apr 22 20:00:25.167516 ip-10-0-135-72 kubenswrapper[2578]: 
I0422 20:00:25.167365 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1161772-3316-4843-8f3d-908f1b869176" containerName="console" Apr 22 20:00:25.167516 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.167463 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1161772-3316-4843-8f3d-908f1b869176" containerName="console" Apr 22 20:00:25.171883 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.171858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.180881 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.180861 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fc7856f6d-84ncp"] Apr 22 20:00:25.237802 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.237771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-config\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.237979 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.237842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-oauth-config\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.237979 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.237879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-serving-cert\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " 
pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.237979 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.237898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-trusted-ca-bundle\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.237979 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.237967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-service-ca\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.238159 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.238008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-oauth-serving-cert\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.238159 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.238024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnk9r\" (UniqueName: \"kubernetes.io/projected/57575df9-d3dc-4c92-9e55-31a28b27a20c-kube-api-access-pnk9r\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339118 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-service-ca\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339306 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-oauth-serving-cert\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339306 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnk9r\" (UniqueName: \"kubernetes.io/projected/57575df9-d3dc-4c92-9e55-31a28b27a20c-kube-api-access-pnk9r\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339423 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-config\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339479 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-oauth-config\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339533 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339489 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-serving-cert\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339533 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-trusted-ca-bundle\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339964 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-service-ca\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.339964 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.339952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-oauth-serving-cert\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.340139 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.340035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-config\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.340549 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.340528 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-trusted-ca-bundle\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.342075 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.342055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-serving-cert\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.342075 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.342072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-oauth-config\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.348240 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.348221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnk9r\" (UniqueName: \"kubernetes.io/projected/57575df9-d3dc-4c92-9e55-31a28b27a20c-kube-api-access-pnk9r\") pod \"console-fc7856f6d-84ncp\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") " pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.482573 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.482494 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:25.606622 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.606584 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fc7856f6d-84ncp"] Apr 22 20:00:25.609655 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:00:25.609629 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57575df9_d3dc_4c92_9e55_31a28b27a20c.slice/crio-401d0529f26396e5b33bfee1387d8a5b82d721ed9e8106c9ca786f6959b482ab WatchSource:0}: Error finding container 401d0529f26396e5b33bfee1387d8a5b82d721ed9e8106c9ca786f6959b482ab: Status 404 returned error can't find the container with id 401d0529f26396e5b33bfee1387d8a5b82d721ed9e8106c9ca786f6959b482ab Apr 22 20:00:25.730469 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.730438 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7856f6d-84ncp" event={"ID":"57575df9-d3dc-4c92-9e55-31a28b27a20c","Type":"ContainerStarted","Data":"cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2"} Apr 22 20:00:25.730469 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.730471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7856f6d-84ncp" event={"ID":"57575df9-d3dc-4c92-9e55-31a28b27a20c","Type":"ContainerStarted","Data":"401d0529f26396e5b33bfee1387d8a5b82d721ed9e8106c9ca786f6959b482ab"} Apr 22 20:00:25.748869 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:25.748761 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fc7856f6d-84ncp" podStartSLOduration=0.74874371 podStartE2EDuration="748.74371ms" podCreationTimestamp="2026-04-22 20:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:25.748189549 +0000 UTC m=+160.278911557" 
watchObservedRunningTime="2026-04-22 20:00:25.74874371 +0000 UTC m=+160.279465719" Apr 22 20:00:35.483366 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:35.483289 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:35.483366 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:35.483325 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:35.488544 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:35.488521 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:35.760496 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:35.760425 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:00:35.805446 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:00:35.805413 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d957bcf7-rl7nv"] Apr 22 20:01:00.827927 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:00.827871 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56d957bcf7-rl7nv" podUID="aa998dbf-0f45-48ce-b059-429bc1dd3da9" containerName="console" containerID="cri-o://e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb" gracePeriod=15 Apr 22 20:01:01.067734 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.067714 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d957bcf7-rl7nv_aa998dbf-0f45-48ce-b059-429bc1dd3da9/console/0.log" Apr 22 20:01:01.067847 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.067774 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 20:01:01.233735 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.233700 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-oauth-serving-cert\") pod \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " Apr 22 20:01:01.233960 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.233741 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-serving-cert\") pod \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " Apr 22 20:01:01.233960 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.233770 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-service-ca\") pod \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " Apr 22 20:01:01.233960 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.233802 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-trusted-ca-bundle\") pod \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " Apr 22 20:01:01.233960 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.233896 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-config\") pod \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " Apr 22 20:01:01.233960 ip-10-0-135-72 
kubenswrapper[2578]: I0422 20:01:01.233928 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vhpq\" (UniqueName: \"kubernetes.io/projected/aa998dbf-0f45-48ce-b059-429bc1dd3da9-kube-api-access-8vhpq\") pod \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " Apr 22 20:01:01.233960 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.233954 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-oauth-config\") pod \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\" (UID: \"aa998dbf-0f45-48ce-b059-429bc1dd3da9\") " Apr 22 20:01:01.234297 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.234265 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa998dbf-0f45-48ce-b059-429bc1dd3da9" (UID: "aa998dbf-0f45-48ce-b059-429bc1dd3da9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:01.234423 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.234377 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa998dbf-0f45-48ce-b059-429bc1dd3da9" (UID: "aa998dbf-0f45-48ce-b059-429bc1dd3da9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:01.234423 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.234404 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa998dbf-0f45-48ce-b059-429bc1dd3da9" (UID: "aa998dbf-0f45-48ce-b059-429bc1dd3da9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:01.234568 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.234444 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-config" (OuterVolumeSpecName: "console-config") pod "aa998dbf-0f45-48ce-b059-429bc1dd3da9" (UID: "aa998dbf-0f45-48ce-b059-429bc1dd3da9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:01.236224 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.236194 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa998dbf-0f45-48ce-b059-429bc1dd3da9" (UID: "aa998dbf-0f45-48ce-b059-429bc1dd3da9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:01.236325 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.236245 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa998dbf-0f45-48ce-b059-429bc1dd3da9-kube-api-access-8vhpq" (OuterVolumeSpecName: "kube-api-access-8vhpq") pod "aa998dbf-0f45-48ce-b059-429bc1dd3da9" (UID: "aa998dbf-0f45-48ce-b059-429bc1dd3da9"). InnerVolumeSpecName "kube-api-access-8vhpq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:01:01.236325 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.236293 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa998dbf-0f45-48ce-b059-429bc1dd3da9" (UID: "aa998dbf-0f45-48ce-b059-429bc1dd3da9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:01.334595 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.334560 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-oauth-serving-cert\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:01.334595 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.334590 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-serving-cert\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:01.334595 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.334601 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-service-ca\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:01.334841 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.334610 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-trusted-ca-bundle\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:01.334841 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.334619 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-config\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:01.334841 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.334628 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8vhpq\" (UniqueName: \"kubernetes.io/projected/aa998dbf-0f45-48ce-b059-429bc1dd3da9-kube-api-access-8vhpq\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:01.334841 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.334637 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa998dbf-0f45-48ce-b059-429bc1dd3da9-console-oauth-config\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:01.831151 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.831118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56d957bcf7-rl7nv_aa998dbf-0f45-48ce-b059-429bc1dd3da9/console/0.log" Apr 22 20:01:01.831524 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.831161 2578 generic.go:358] "Generic (PLEG): container finished" podID="aa998dbf-0f45-48ce-b059-429bc1dd3da9" containerID="e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb" exitCode=2 Apr 22 20:01:01.831524 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.831241 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56d957bcf7-rl7nv" Apr 22 20:01:01.831524 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.831255 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d957bcf7-rl7nv" event={"ID":"aa998dbf-0f45-48ce-b059-429bc1dd3da9","Type":"ContainerDied","Data":"e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb"} Apr 22 20:01:01.831524 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.831296 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56d957bcf7-rl7nv" event={"ID":"aa998dbf-0f45-48ce-b059-429bc1dd3da9","Type":"ContainerDied","Data":"6d78173472a2dddd2cfba1611ad6d7454e8d4b2e307da1ae6976abf4b81363fd"} Apr 22 20:01:01.831524 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.831313 2578 scope.go:117] "RemoveContainer" containerID="e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb" Apr 22 20:01:01.839299 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.839283 2578 scope.go:117] "RemoveContainer" containerID="e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb" Apr 22 20:01:01.839550 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:01.839533 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb\": container with ID starting with e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb not found: ID does not exist" containerID="e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb" Apr 22 20:01:01.839593 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.839558 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb"} err="failed to get container status \"e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb\": rpc error: code = 
NotFound desc = could not find container \"e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb\": container with ID starting with e526655108bc3912e14ff9e231df1ab6ceff0e12636c926c343c644fcd511bdb not found: ID does not exist" Apr 22 20:01:01.850108 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.850085 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56d957bcf7-rl7nv"] Apr 22 20:01:01.854247 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:01.854227 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56d957bcf7-rl7nv"] Apr 22 20:01:02.046266 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:02.046222 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa998dbf-0f45-48ce-b059-429bc1dd3da9" path="/var/lib/kubelet/pods/aa998dbf-0f45-48ce-b059-429bc1dd3da9/volumes" Apr 22 20:01:20.364899 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.364863 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt"] Apr 22 20:01:20.365306 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.365152 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa998dbf-0f45-48ce-b059-429bc1dd3da9" containerName="console" Apr 22 20:01:20.365306 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.365163 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa998dbf-0f45-48ce-b059-429bc1dd3da9" containerName="console" Apr 22 20:01:20.365306 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.365214 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa998dbf-0f45-48ce-b059-429bc1dd3da9" containerName="console" Apr 22 20:01:20.368175 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.368157 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.370429 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.370407 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-q25rl\"" Apr 22 20:01:20.370535 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.370408 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 20:01:20.371284 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.371268 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 20:01:20.374139 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.374119 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.374248 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.374167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v776f\" (UniqueName: \"kubernetes.io/projected/2b93dd91-7115-4f80-b33c-368a895deda9-kube-api-access-v776f\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.374248 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.374219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.376933 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.376902 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt"] Apr 22 20:01:20.474721 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.474678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.474943 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.474735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.474943 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.474764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v776f\" (UniqueName: \"kubernetes.io/projected/2b93dd91-7115-4f80-b33c-368a895deda9-kube-api-access-v776f\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.475084 ip-10-0-135-72 
kubenswrapper[2578]: I0422 20:01:20.475061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.475171 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.475152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.485383 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.485361 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v776f\" (UniqueName: \"kubernetes.io/projected/2b93dd91-7115-4f80-b33c-368a895deda9-kube-api-access-v776f\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.678076 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.677993 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:20.797382 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.797355 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt"] Apr 22 20:01:20.799915 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:01:20.799885 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b93dd91_7115_4f80_b33c_368a895deda9.slice/crio-700189b05b8d37d738652624697f47420f9658a6dab6732dae9f504fb7fdfff9 WatchSource:0}: Error finding container 700189b05b8d37d738652624697f47420f9658a6dab6732dae9f504fb7fdfff9: Status 404 returned error can't find the container with id 700189b05b8d37d738652624697f47420f9658a6dab6732dae9f504fb7fdfff9 Apr 22 20:01:20.885436 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:20.885404 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" event={"ID":"2b93dd91-7115-4f80-b33c-368a895deda9","Type":"ContainerStarted","Data":"700189b05b8d37d738652624697f47420f9658a6dab6732dae9f504fb7fdfff9"} Apr 22 20:01:28.909412 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:28.909377 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b93dd91-7115-4f80-b33c-368a895deda9" containerID="893d40d6ea89e8ae12d06671b428c663963de36d3ad117a2c97a389acec846f9" exitCode=0 Apr 22 20:01:28.909837 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:28.909477 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" event={"ID":"2b93dd91-7115-4f80-b33c-368a895deda9","Type":"ContainerDied","Data":"893d40d6ea89e8ae12d06671b428c663963de36d3ad117a2c97a389acec846f9"} Apr 22 20:01:31.920315 ip-10-0-135-72 kubenswrapper[2578]: I0422 
20:01:31.920273 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b93dd91-7115-4f80-b33c-368a895deda9" containerID="b25bafca276cd0d996ef16927d3a6d3670c36ac837b444221922a5512e1486a0" exitCode=0 Apr 22 20:01:31.920787 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:31.920363 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" event={"ID":"2b93dd91-7115-4f80-b33c-368a895deda9","Type":"ContainerDied","Data":"b25bafca276cd0d996ef16927d3a6d3670c36ac837b444221922a5512e1486a0"} Apr 22 20:01:39.947514 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:39.947476 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b93dd91-7115-4f80-b33c-368a895deda9" containerID="396469071c0ed62591c38a23b6d06e4ad3a16b86046217298d1656c80ba2f018" exitCode=0 Apr 22 20:01:39.947909 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:39.947558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" event={"ID":"2b93dd91-7115-4f80-b33c-368a895deda9","Type":"ContainerDied","Data":"396469071c0ed62591c38a23b6d06e4ad3a16b86046217298d1656c80ba2f018"} Apr 22 20:01:41.073388 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.073364 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:41.149833 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.149757 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-bundle\") pod \"2b93dd91-7115-4f80-b33c-368a895deda9\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " Apr 22 20:01:41.150030 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.149863 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-util\") pod \"2b93dd91-7115-4f80-b33c-368a895deda9\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " Apr 22 20:01:41.150030 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.149966 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v776f\" (UniqueName: \"kubernetes.io/projected/2b93dd91-7115-4f80-b33c-368a895deda9-kube-api-access-v776f\") pod \"2b93dd91-7115-4f80-b33c-368a895deda9\" (UID: \"2b93dd91-7115-4f80-b33c-368a895deda9\") " Apr 22 20:01:41.150583 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.150493 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-bundle" (OuterVolumeSpecName: "bundle") pod "2b93dd91-7115-4f80-b33c-368a895deda9" (UID: "2b93dd91-7115-4f80-b33c-368a895deda9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:01:41.152298 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.152271 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b93dd91-7115-4f80-b33c-368a895deda9-kube-api-access-v776f" (OuterVolumeSpecName: "kube-api-access-v776f") pod "2b93dd91-7115-4f80-b33c-368a895deda9" (UID: "2b93dd91-7115-4f80-b33c-368a895deda9"). InnerVolumeSpecName "kube-api-access-v776f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:01:41.154163 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.154139 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-util" (OuterVolumeSpecName: "util") pod "2b93dd91-7115-4f80-b33c-368a895deda9" (UID: "2b93dd91-7115-4f80-b33c-368a895deda9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:01:41.251235 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.251137 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v776f\" (UniqueName: \"kubernetes.io/projected/2b93dd91-7115-4f80-b33c-368a895deda9-kube-api-access-v776f\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:41.251235 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.251171 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-bundle\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:41.251235 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.251181 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b93dd91-7115-4f80-b33c-368a895deda9-util\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:01:41.954339 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.954300 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" event={"ID":"2b93dd91-7115-4f80-b33c-368a895deda9","Type":"ContainerDied","Data":"700189b05b8d37d738652624697f47420f9658a6dab6732dae9f504fb7fdfff9"} Apr 22 20:01:41.954339 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.954345 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="700189b05b8d37d738652624697f47420f9658a6dab6732dae9f504fb7fdfff9" Apr 22 20:01:41.954545 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:41.954321 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cpqmpt" Apr 22 20:01:47.449821 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.449769 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr"] Apr 22 20:01:47.450236 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.450065 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b93dd91-7115-4f80-b33c-368a895deda9" containerName="extract" Apr 22 20:01:47.450236 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.450077 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b93dd91-7115-4f80-b33c-368a895deda9" containerName="extract" Apr 22 20:01:47.450236 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.450096 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b93dd91-7115-4f80-b33c-368a895deda9" containerName="util" Apr 22 20:01:47.450236 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.450102 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b93dd91-7115-4f80-b33c-368a895deda9" containerName="util" Apr 22 20:01:47.450236 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.450108 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2b93dd91-7115-4f80-b33c-368a895deda9" containerName="pull" Apr 22 20:01:47.450236 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.450114 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b93dd91-7115-4f80-b33c-368a895deda9" containerName="pull" Apr 22 20:01:47.450236 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.450159 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b93dd91-7115-4f80-b33c-368a895deda9" containerName="extract" Apr 22 20:01:47.457348 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.457318 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.461608 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.461581 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 20:01:47.461608 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.461598 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 20:01:47.462906 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.462885 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 20:01:47.463038 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.462925 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-ml4zg\"" Apr 22 20:01:47.485901 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.485870 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr"] Apr 22 20:01:47.604873 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.604833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5ql\" (UniqueName: 
\"kubernetes.io/projected/9311ca5d-0e20-4569-a68b-edbbd7f3f8a9-kube-api-access-gz5ql\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr\" (UID: \"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.605133 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.604923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9311ca5d-0e20-4569-a68b-edbbd7f3f8a9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr\" (UID: \"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.705398 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.705298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz5ql\" (UniqueName: \"kubernetes.io/projected/9311ca5d-0e20-4569-a68b-edbbd7f3f8a9-kube-api-access-gz5ql\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr\" (UID: \"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.705398 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.705386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9311ca5d-0e20-4569-a68b-edbbd7f3f8a9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr\" (UID: \"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.707840 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.707795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9311ca5d-0e20-4569-a68b-edbbd7f3f8a9-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr\" (UID: 
\"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.715664 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.715641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5ql\" (UniqueName: \"kubernetes.io/projected/9311ca5d-0e20-4569-a68b-edbbd7f3f8a9-kube-api-access-gz5ql\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr\" (UID: \"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.767677 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.767644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:47.900849 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.900729 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr"] Apr 22 20:01:47.903669 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:01:47.903642 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9311ca5d_0e20_4569_a68b_edbbd7f3f8a9.slice/crio-3dba4db5e9f04bb3bd85ae4b9167b23f54437481b8fb84c74c0fe504899ce67d WatchSource:0}: Error finding container 3dba4db5e9f04bb3bd85ae4b9167b23f54437481b8fb84c74c0fe504899ce67d: Status 404 returned error can't find the container with id 3dba4db5e9f04bb3bd85ae4b9167b23f54437481b8fb84c74c0fe504899ce67d Apr 22 20:01:47.972872 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:47.972779 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" event={"ID":"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9","Type":"ContainerStarted","Data":"3dba4db5e9f04bb3bd85ae4b9167b23f54437481b8fb84c74c0fe504899ce67d"} Apr 22 20:01:52.657969 ip-10-0-135-72 kubenswrapper[2578]: I0422 
20:01:52.657934 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vqfgz"] Apr 22 20:01:52.661088 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.661069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.663463 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.663437 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 20:01:52.663904 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.663871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 20:01:52.663998 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.663927 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5qpst\"" Apr 22 20:01:52.672971 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.672951 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vqfgz"] Apr 22 20:01:52.747304 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.747273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.747477 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.747333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1772922f-4ae3-4a72-999e-9ebc4ed549ff-cabundle0\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 
20:01:52.747477 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.747423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49qhx\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-kube-api-access-49qhx\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.848794 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.848753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1772922f-4ae3-4a72-999e-9ebc4ed549ff-cabundle0\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.848998 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.848802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49qhx\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-kube-api-access-49qhx\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.848998 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.848861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.848998 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:52.848983 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 22 20:01:52.848998 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:52.848999 2578 secret.go:281] references 
non-existent secret key: ca.crt Apr 22 20:01:52.849207 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:52.849007 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:01:52.849207 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:52.849019 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vqfgz: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 20:01:52.849207 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:52.849079 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates podName:1772922f-4ae3-4a72-999e-9ebc4ed549ff nodeName:}" failed. No retries permitted until 2026-04-22 20:01:53.349061679 +0000 UTC m=+247.879783670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates") pod "keda-operator-ffbb595cb-vqfgz" (UID: "1772922f-4ae3-4a72-999e-9ebc4ed549ff") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 20:01:52.849458 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.849436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1772922f-4ae3-4a72-999e-9ebc4ed549ff-cabundle0\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.858222 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.858194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49qhx\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-kube-api-access-49qhx\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: 
\"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:52.994480 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.994383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" event={"ID":"9311ca5d-0e20-4569-a68b-edbbd7f3f8a9","Type":"ContainerStarted","Data":"70f4c08fda7e253e220a1776824fff217c5e46612a472c2451f9b6ae62dd68fd"} Apr 22 20:01:52.994480 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:52.994456 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" Apr 22 20:01:53.016762 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.016711 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr" podStartSLOduration=1.823482254 podStartE2EDuration="6.016698231s" podCreationTimestamp="2026-04-22 20:01:47 +0000 UTC" firstStartedPulling="2026-04-22 20:01:47.905440864 +0000 UTC m=+242.436162851" lastFinishedPulling="2026-04-22 20:01:52.098656833 +0000 UTC m=+246.629378828" observedRunningTime="2026-04-22 20:01:53.015090831 +0000 UTC m=+247.545812839" watchObservedRunningTime="2026-04-22 20:01:53.016698231 +0000 UTC m=+247.547420240" Apr 22 20:01:53.095045 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.095017 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp"] Apr 22 20:01:53.102171 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.102145 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.104525 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.104502 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 20:01:53.106752 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.106732 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp"] Apr 22 20:01:53.253574 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.253471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.253574 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.253531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b83f6777-1696-44bf-8b30-ad62a9417641-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.253803 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.253607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rls\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-kube-api-access-76rls\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.354526 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.354474 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.354548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.354604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b83f6777-1696-44bf-8b30-ad62a9417641-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354624 2578 secret.go:281] references non-existent secret key: ca.crt Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354653 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354663 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vqfgz: references non-existent secret key: ca.crt Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.354668 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76rls\" (UniqueName: 
\"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-kube-api-access-76rls\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354702 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:01:53.354729 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354730 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:01:53.355074 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354752 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp: references non-existent secret key: tls.crt Apr 22 20:01:53.355074 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354732 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates podName:1772922f-4ae3-4a72-999e-9ebc4ed549ff nodeName:}" failed. No retries permitted until 2026-04-22 20:01:54.354713887 +0000 UTC m=+248.885435875 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates") pod "keda-operator-ffbb595cb-vqfgz" (UID: "1772922f-4ae3-4a72-999e-9ebc4ed549ff") : references non-existent secret key: ca.crt Apr 22 20:01:53.355074 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.354853 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates podName:b83f6777-1696-44bf-8b30-ad62a9417641 nodeName:}" failed. No retries permitted until 2026-04-22 20:01:53.854829748 +0000 UTC m=+248.385551751 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates") pod "keda-metrics-apiserver-7c9f485588-pjcgp" (UID: "b83f6777-1696-44bf-8b30-ad62a9417641") : references non-existent secret key: tls.crt Apr 22 20:01:53.355074 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.355063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b83f6777-1696-44bf-8b30-ad62a9417641-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.363359 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.363333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rls\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-kube-api-access-76rls\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.390270 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.390238 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-2j6hj"] Apr 22 20:01:53.393440 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.393423 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.396055 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.396036 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 20:01:53.402365 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.402337 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-2j6hj"] Apr 22 20:01:53.557081 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.556990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf9rp\" (UniqueName: \"kubernetes.io/projected/09dfedc6-8087-451c-9ab0-4145e5708f8e-kube-api-access-wf9rp\") pod \"keda-admission-cf49989db-2j6hj\" (UID: \"09dfedc6-8087-451c-9ab0-4145e5708f8e\") " pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.557081 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.557033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/09dfedc6-8087-451c-9ab0-4145e5708f8e-certificates\") pod \"keda-admission-cf49989db-2j6hj\" (UID: \"09dfedc6-8087-451c-9ab0-4145e5708f8e\") " pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.658537 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.658501 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf9rp\" (UniqueName: \"kubernetes.io/projected/09dfedc6-8087-451c-9ab0-4145e5708f8e-kube-api-access-wf9rp\") pod \"keda-admission-cf49989db-2j6hj\" (UID: \"09dfedc6-8087-451c-9ab0-4145e5708f8e\") " pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.658933 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.658558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/09dfedc6-8087-451c-9ab0-4145e5708f8e-certificates\") pod \"keda-admission-cf49989db-2j6hj\" (UID: \"09dfedc6-8087-451c-9ab0-4145e5708f8e\") " pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.661440 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.661416 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/09dfedc6-8087-451c-9ab0-4145e5708f8e-certificates\") pod \"keda-admission-cf49989db-2j6hj\" (UID: \"09dfedc6-8087-451c-9ab0-4145e5708f8e\") " pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.666175 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.666155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf9rp\" (UniqueName: \"kubernetes.io/projected/09dfedc6-8087-451c-9ab0-4145e5708f8e-kube-api-access-wf9rp\") pod \"keda-admission-cf49989db-2j6hj\" (UID: \"09dfedc6-8087-451c-9ab0-4145e5708f8e\") " pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.704176 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.704134 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:53.831892 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.831802 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-2j6hj"] Apr 22 20:01:53.835092 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:01:53.835064 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09dfedc6_8087_451c_9ab0_4145e5708f8e.slice/crio-46174b861f69dfcb5b33831269ed8b67f862953b724d54fa5c2dd78ebaff2f77 WatchSource:0}: Error finding container 46174b861f69dfcb5b33831269ed8b67f862953b724d54fa5c2dd78ebaff2f77: Status 404 returned error can't find the container with id 46174b861f69dfcb5b33831269ed8b67f862953b724d54fa5c2dd78ebaff2f77 Apr 22 20:01:53.861316 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.861287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:53.861433 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.861388 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:01:53.861433 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.861399 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:01:53.861433 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.861417 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp: references non-existent secret key: tls.crt Apr 22 20:01:53.861551 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:53.861463 2578 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates podName:b83f6777-1696-44bf-8b30-ad62a9417641 nodeName:}" failed. No retries permitted until 2026-04-22 20:01:54.861449139 +0000 UTC m=+249.392171126 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates") pod "keda-metrics-apiserver-7c9f485588-pjcgp" (UID: "b83f6777-1696-44bf-8b30-ad62a9417641") : references non-existent secret key: tls.crt Apr 22 20:01:53.998443 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:53.998403 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-2j6hj" event={"ID":"09dfedc6-8087-451c-9ab0-4145e5708f8e","Type":"ContainerStarted","Data":"46174b861f69dfcb5b33831269ed8b67f862953b724d54fa5c2dd78ebaff2f77"} Apr 22 20:01:54.365630 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:54.365593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:54.365831 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.365758 2578 secret.go:281] references non-existent secret key: ca.crt Apr 22 20:01:54.365831 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.365780 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:01:54.365831 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.365792 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vqfgz: references non-existent secret key: ca.crt Apr 22 20:01:54.366055 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.365875 2578 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates podName:1772922f-4ae3-4a72-999e-9ebc4ed549ff nodeName:}" failed. No retries permitted until 2026-04-22 20:01:56.365859847 +0000 UTC m=+250.896581833 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates") pod "keda-operator-ffbb595cb-vqfgz" (UID: "1772922f-4ae3-4a72-999e-9ebc4ed549ff") : references non-existent secret key: ca.crt Apr 22 20:01:54.870181 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:54.870139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:54.870546 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.870316 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:01:54.870546 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.870340 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:01:54.870546 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.870364 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp: references non-existent secret key: tls.crt Apr 22 20:01:54.870546 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:54.870428 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates podName:b83f6777-1696-44bf-8b30-ad62a9417641 nodeName:}" failed. 
No retries permitted until 2026-04-22 20:01:56.870409474 +0000 UTC m=+251.401131475 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates") pod "keda-metrics-apiserver-7c9f485588-pjcgp" (UID: "b83f6777-1696-44bf-8b30-ad62a9417641") : references non-existent secret key: tls.crt Apr 22 20:01:56.006716 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:56.006673 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-2j6hj" event={"ID":"09dfedc6-8087-451c-9ab0-4145e5708f8e","Type":"ContainerStarted","Data":"dc627160b887f1afb223d146db449a7bd97da53110c0aa1ecb45934377a86f52"} Apr 22 20:01:56.007187 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:56.006787 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-2j6hj" Apr 22 20:01:56.025940 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:56.025896 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-2j6hj" podStartSLOduration=1.6162992649999999 podStartE2EDuration="3.025883716s" podCreationTimestamp="2026-04-22 20:01:53 +0000 UTC" firstStartedPulling="2026-04-22 20:01:53.836330229 +0000 UTC m=+248.367052217" lastFinishedPulling="2026-04-22 20:01:55.24591466 +0000 UTC m=+249.776636668" observedRunningTime="2026-04-22 20:01:56.023960296 +0000 UTC m=+250.554682304" watchObservedRunningTime="2026-04-22 20:01:56.025883716 +0000 UTC m=+250.556605724" Apr 22 20:01:56.382025 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:56.381997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " 
pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:01:56.382170 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.382101 2578 secret.go:281] references non-existent secret key: ca.crt Apr 22 20:01:56.382170 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.382113 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 20:01:56.382170 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.382121 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-vqfgz: references non-existent secret key: ca.crt Apr 22 20:01:56.382289 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.382174 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates podName:1772922f-4ae3-4a72-999e-9ebc4ed549ff nodeName:}" failed. No retries permitted until 2026-04-22 20:02:00.38216077 +0000 UTC m=+254.912882756 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates") pod "keda-operator-ffbb595cb-vqfgz" (UID: "1772922f-4ae3-4a72-999e-9ebc4ed549ff") : references non-existent secret key: ca.crt Apr 22 20:01:56.885519 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:01:56.885492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:01:56.885672 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.885623 2578 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:01:56.885672 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.885642 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:01:56.885672 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.885659 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp: references non-existent secret key: tls.crt Apr 22 20:01:56.885773 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:01:56.885706 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates podName:b83f6777-1696-44bf-8b30-ad62a9417641 nodeName:}" failed. No retries permitted until 2026-04-22 20:02:00.885693352 +0000 UTC m=+255.416415338 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates") pod "keda-metrics-apiserver-7c9f485588-pjcgp" (UID: "b83f6777-1696-44bf-8b30-ad62a9417641") : references non-existent secret key: tls.crt Apr 22 20:02:00.412625 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:00.412568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:02:00.415122 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:00.415100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1772922f-4ae3-4a72-999e-9ebc4ed549ff-certificates\") pod \"keda-operator-ffbb595cb-vqfgz\" (UID: \"1772922f-4ae3-4a72-999e-9ebc4ed549ff\") " pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:02:00.471123 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:00.471093 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" Apr 22 20:02:00.588701 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:00.588676 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-vqfgz"] Apr 22 20:02:00.591169 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:02:00.591140 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1772922f_4ae3_4a72_999e_9ebc4ed549ff.slice/crio-b789373a7e109d6a8e1cee783dfde764e13b46fc403f265aa65f0be485599cb4 WatchSource:0}: Error finding container b789373a7e109d6a8e1cee783dfde764e13b46fc403f265aa65f0be485599cb4: Status 404 returned error can't find the container with id b789373a7e109d6a8e1cee783dfde764e13b46fc403f265aa65f0be485599cb4 Apr 22 20:02:00.916426 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:00.916390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:02:00.919009 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:00.918981 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b83f6777-1696-44bf-8b30-ad62a9417641-certificates\") pod \"keda-metrics-apiserver-7c9f485588-pjcgp\" (UID: \"b83f6777-1696-44bf-8b30-ad62a9417641\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:02:01.022403 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:01.022367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" 
event={"ID":"1772922f-4ae3-4a72-999e-9ebc4ed549ff","Type":"ContainerStarted","Data":"b789373a7e109d6a8e1cee783dfde764e13b46fc403f265aa65f0be485599cb4"} Apr 22 20:02:01.214618 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:01.214536 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:02:01.339206 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:01.339179 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp"] Apr 22 20:02:01.341739 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:02:01.341708 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb83f6777_1696_44bf_8b30_ad62a9417641.slice/crio-b3c987f0bd23d9e90fe51371ba9e7e7879ba407a05fc3e8b43a160e251b7b085 WatchSource:0}: Error finding container b3c987f0bd23d9e90fe51371ba9e7e7879ba407a05fc3e8b43a160e251b7b085: Status 404 returned error can't find the container with id b3c987f0bd23d9e90fe51371ba9e7e7879ba407a05fc3e8b43a160e251b7b085 Apr 22 20:02:02.026826 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:02.026781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" event={"ID":"b83f6777-1696-44bf-8b30-ad62a9417641","Type":"ContainerStarted","Data":"b3c987f0bd23d9e90fe51371ba9e7e7879ba407a05fc3e8b43a160e251b7b085"} Apr 22 20:02:06.046288 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:06.046262 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" Apr 22 20:02:06.046631 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:06.046293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" 
event={"ID":"1772922f-4ae3-4a72-999e-9ebc4ed549ff","Type":"ContainerStarted","Data":"871408246dcb5a1e7cf77097771e5704d66c55452857bdb92390540965b01392"} Apr 22 20:02:06.046631 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:06.046306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" event={"ID":"b83f6777-1696-44bf-8b30-ad62a9417641","Type":"ContainerStarted","Data":"d9297653c121514364df8443637fb361ea1449d6c2c61f8e8064d61a7745a497"} Apr 22 20:02:06.101617 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:06.101568 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-vqfgz" podStartSLOduration=9.134324725 podStartE2EDuration="14.101555595s" podCreationTimestamp="2026-04-22 20:01:52 +0000 UTC" firstStartedPulling="2026-04-22 20:02:00.592374105 +0000 UTC m=+255.123096094" lastFinishedPulling="2026-04-22 20:02:05.559604974 +0000 UTC m=+260.090326964" observedRunningTime="2026-04-22 20:02:06.100326274 +0000 UTC m=+260.631048305" watchObservedRunningTime="2026-04-22 20:02:06.101555595 +0000 UTC m=+260.632277603" Apr 22 20:02:06.116002 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:06.115961 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp" podStartSLOduration=8.898408716 podStartE2EDuration="13.11594935s" podCreationTimestamp="2026-04-22 20:01:53 +0000 UTC" firstStartedPulling="2026-04-22 20:02:01.343038722 +0000 UTC m=+255.873760709" lastFinishedPulling="2026-04-22 20:02:05.560579353 +0000 UTC m=+260.091301343" observedRunningTime="2026-04-22 20:02:06.115197047 +0000 UTC m=+260.645919056" watchObservedRunningTime="2026-04-22 20:02:06.11594935 +0000 UTC m=+260.646671358" Apr 22 20:02:07.048304 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:07.048264 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/keda-operator-ffbb595cb-vqfgz"
Apr 22 20:02:14.000853 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:14.000796 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wvkrr"
Apr 22 20:02:17.011561 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:17.011532 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-2j6hj"
Apr 22 20:02:17.052829 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:17.052789 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-pjcgp"
Apr 22 20:02:28.053502 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:28.053469 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-vqfgz"
Apr 22 20:02:45.914550 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:45.914519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log"
Apr 22 20:02:45.915964 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:45.915937 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log"
Apr 22 20:02:45.933530 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:45.933498 2578 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 20:02:58.479708 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.479669 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-xlkfm"]
Apr 22 20:02:58.481768 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.481748 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.484027 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.484008 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 20:02:58.484776 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.484759 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 20:02:58.484891 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.484776 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6xk8w\""
Apr 22 20:02:58.484891 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.484791 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 20:02:58.491649 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.491628 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-xlkfm"]
Apr 22 20:02:58.568591 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.568559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnxkq\" (UniqueName: \"kubernetes.io/projected/f8ee8283-a178-4e05-8ef0-bef0bae22044-kube-api-access-lnxkq\") pod \"seaweedfs-86cc847c5c-xlkfm\" (UID: \"f8ee8283-a178-4e05-8ef0-bef0bae22044\") " pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.568748 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.568620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f8ee8283-a178-4e05-8ef0-bef0bae22044-data\") pod \"seaweedfs-86cc847c5c-xlkfm\" (UID: \"f8ee8283-a178-4e05-8ef0-bef0bae22044\") " pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.670091 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.670057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnxkq\" (UniqueName: \"kubernetes.io/projected/f8ee8283-a178-4e05-8ef0-bef0bae22044-kube-api-access-lnxkq\") pod \"seaweedfs-86cc847c5c-xlkfm\" (UID: \"f8ee8283-a178-4e05-8ef0-bef0bae22044\") " pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.670091 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.670102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f8ee8283-a178-4e05-8ef0-bef0bae22044-data\") pod \"seaweedfs-86cc847c5c-xlkfm\" (UID: \"f8ee8283-a178-4e05-8ef0-bef0bae22044\") " pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.670472 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.670455 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f8ee8283-a178-4e05-8ef0-bef0bae22044-data\") pod \"seaweedfs-86cc847c5c-xlkfm\" (UID: \"f8ee8283-a178-4e05-8ef0-bef0bae22044\") " pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.680047 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.680021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnxkq\" (UniqueName: \"kubernetes.io/projected/f8ee8283-a178-4e05-8ef0-bef0bae22044-kube-api-access-lnxkq\") pod \"seaweedfs-86cc847c5c-xlkfm\" (UID: \"f8ee8283-a178-4e05-8ef0-bef0bae22044\") " pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.792148 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.792086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:02:58.915615 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.915482 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-xlkfm"]
Apr 22 20:02:58.918584 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:02:58.918553 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ee8283_a178_4e05_8ef0_bef0bae22044.slice/crio-01679e80f7c65e8836a4574b2cda7b7bd38c673b048343aa153d847442bbdfec WatchSource:0}: Error finding container 01679e80f7c65e8836a4574b2cda7b7bd38c673b048343aa153d847442bbdfec: Status 404 returned error can't find the container with id 01679e80f7c65e8836a4574b2cda7b7bd38c673b048343aa153d847442bbdfec
Apr 22 20:02:58.920162 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:58.920145 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:02:59.215072 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:02:59.215037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-xlkfm" event={"ID":"f8ee8283-a178-4e05-8ef0-bef0bae22044","Type":"ContainerStarted","Data":"01679e80f7c65e8836a4574b2cda7b7bd38c673b048343aa153d847442bbdfec"}
Apr 22 20:03:02.228120 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:02.228086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-xlkfm" event={"ID":"f8ee8283-a178-4e05-8ef0-bef0bae22044","Type":"ContainerStarted","Data":"9cddcfd9d00f0fc3363a7ae130385ca352a5638d91614c28ad8f8562762d1e4c"}
Apr 22 20:03:02.228478 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:02.228159 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:03:02.244620 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:02.244567 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-xlkfm" podStartSLOduration=1.327817895 podStartE2EDuration="4.244548175s" podCreationTimestamp="2026-04-22 20:02:58 +0000 UTC" firstStartedPulling="2026-04-22 20:02:58.920272411 +0000 UTC m=+313.450994398" lastFinishedPulling="2026-04-22 20:03:01.837002678 +0000 UTC m=+316.367724678" observedRunningTime="2026-04-22 20:03:02.243634663 +0000 UTC m=+316.774356674" watchObservedRunningTime="2026-04-22 20:03:02.244548175 +0000 UTC m=+316.775270186"
Apr 22 20:03:08.234279 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:08.234249 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-xlkfm"
Apr 22 20:03:34.334728 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.334648 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-665c47d676-47g8t"]
Apr 22 20:03:34.336955 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.336939 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.340072 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.340046 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-gz55x\""
Apr 22 20:03:34.340072 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.340061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 20:03:34.344495 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.344473 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-47g8t"]
Apr 22 20:03:34.349035 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.349010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/220fa352-1edc-488c-acc4-5d90d6a02793-cert\") pod \"kserve-controller-manager-665c47d676-47g8t\" (UID: \"220fa352-1edc-488c-acc4-5d90d6a02793\") " pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.349125 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.349068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6jt\" (UniqueName: \"kubernetes.io/projected/220fa352-1edc-488c-acc4-5d90d6a02793-kube-api-access-vf6jt\") pod \"kserve-controller-manager-665c47d676-47g8t\" (UID: \"220fa352-1edc-488c-acc4-5d90d6a02793\") " pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.449948 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.449914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/220fa352-1edc-488c-acc4-5d90d6a02793-cert\") pod \"kserve-controller-manager-665c47d676-47g8t\" (UID: \"220fa352-1edc-488c-acc4-5d90d6a02793\") " pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.450136 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.449958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6jt\" (UniqueName: \"kubernetes.io/projected/220fa352-1edc-488c-acc4-5d90d6a02793-kube-api-access-vf6jt\") pod \"kserve-controller-manager-665c47d676-47g8t\" (UID: \"220fa352-1edc-488c-acc4-5d90d6a02793\") " pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.452519 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.452497 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/220fa352-1edc-488c-acc4-5d90d6a02793-cert\") pod \"kserve-controller-manager-665c47d676-47g8t\" (UID: \"220fa352-1edc-488c-acc4-5d90d6a02793\") " pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.457541 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.457517 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6jt\" (UniqueName: \"kubernetes.io/projected/220fa352-1edc-488c-acc4-5d90d6a02793-kube-api-access-vf6jt\") pod \"kserve-controller-manager-665c47d676-47g8t\" (UID: \"220fa352-1edc-488c-acc4-5d90d6a02793\") " pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.646940 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.646909 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:34.767708 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:34.767685 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-47g8t"]
Apr 22 20:03:34.770097 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:03:34.770073 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220fa352_1edc_488c_acc4_5d90d6a02793.slice/crio-37e42b6a1849a8a6e6d03990b641308e907ad6384227eee67eab832348dca1e7 WatchSource:0}: Error finding container 37e42b6a1849a8a6e6d03990b641308e907ad6384227eee67eab832348dca1e7: Status 404 returned error can't find the container with id 37e42b6a1849a8a6e6d03990b641308e907ad6384227eee67eab832348dca1e7
Apr 22 20:03:35.339002 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:35.338951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-47g8t" event={"ID":"220fa352-1edc-488c-acc4-5d90d6a02793","Type":"ContainerStarted","Data":"37e42b6a1849a8a6e6d03990b641308e907ad6384227eee67eab832348dca1e7"}
Apr 22 20:03:37.349776 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:37.349743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-47g8t" event={"ID":"220fa352-1edc-488c-acc4-5d90d6a02793","Type":"ContainerStarted","Data":"1dbe8cf3f5dea03964b2ebca7090f3a52fd88d24db4e8aa441395a1f00bffd0e"}
Apr 22 20:03:37.350158 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:37.349872 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:03:37.369206 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:03:37.369153 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-665c47d676-47g8t" podStartSLOduration=0.929230855 podStartE2EDuration="3.369138587s" podCreationTimestamp="2026-04-22 20:03:34 +0000 UTC" firstStartedPulling="2026-04-22 20:03:34.771345029 +0000 UTC m=+349.302067015" lastFinishedPulling="2026-04-22 20:03:37.211252745 +0000 UTC m=+351.741974747" observedRunningTime="2026-04-22 20:03:37.3666384 +0000 UTC m=+351.897360390" watchObservedRunningTime="2026-04-22 20:03:37.369138587 +0000 UTC m=+351.899860647"
Apr 22 20:04:08.358379 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:08.358340 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-665c47d676-47g8t"
Apr 22 20:04:09.207622 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.207584 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-nfz97"]
Apr 22 20:04:09.212245 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.212226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:09.216974 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.216953 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 22 20:04:09.217097 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.216984 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-cqlcz\""
Apr 22 20:04:09.232419 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.232395 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nfz97"]
Apr 22 20:04:09.344243 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.344204 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wjvq\" (UniqueName: \"kubernetes.io/projected/2fa8c9a2-6512-4791-906f-3f30b6597438-kube-api-access-9wjvq\") pod \"model-serving-api-86f7b4b499-nfz97\" (UID: \"2fa8c9a2-6512-4791-906f-3f30b6597438\") " pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:09.344412 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.344257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fa8c9a2-6512-4791-906f-3f30b6597438-tls-certs\") pod \"model-serving-api-86f7b4b499-nfz97\" (UID: \"2fa8c9a2-6512-4791-906f-3f30b6597438\") " pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:09.445138 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.445098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fa8c9a2-6512-4791-906f-3f30b6597438-tls-certs\") pod \"model-serving-api-86f7b4b499-nfz97\" (UID: \"2fa8c9a2-6512-4791-906f-3f30b6597438\") " pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:09.445495 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.445187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wjvq\" (UniqueName: \"kubernetes.io/projected/2fa8c9a2-6512-4791-906f-3f30b6597438-kube-api-access-9wjvq\") pod \"model-serving-api-86f7b4b499-nfz97\" (UID: \"2fa8c9a2-6512-4791-906f-3f30b6597438\") " pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:09.445495 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:04:09.445247 2578 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 22 20:04:09.445495 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:04:09.445320 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fa8c9a2-6512-4791-906f-3f30b6597438-tls-certs podName:2fa8c9a2-6512-4791-906f-3f30b6597438 nodeName:}" failed. No retries permitted until 2026-04-22 20:04:09.94530429 +0000 UTC m=+384.476026282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2fa8c9a2-6512-4791-906f-3f30b6597438-tls-certs") pod "model-serving-api-86f7b4b499-nfz97" (UID: "2fa8c9a2-6512-4791-906f-3f30b6597438") : secret "model-serving-api-tls" not found
Apr 22 20:04:09.455489 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.455454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wjvq\" (UniqueName: \"kubernetes.io/projected/2fa8c9a2-6512-4791-906f-3f30b6597438-kube-api-access-9wjvq\") pod \"model-serving-api-86f7b4b499-nfz97\" (UID: \"2fa8c9a2-6512-4791-906f-3f30b6597438\") " pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:09.950179 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.950142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fa8c9a2-6512-4791-906f-3f30b6597438-tls-certs\") pod \"model-serving-api-86f7b4b499-nfz97\" (UID: \"2fa8c9a2-6512-4791-906f-3f30b6597438\") " pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:09.952672 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:09.952652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fa8c9a2-6512-4791-906f-3f30b6597438-tls-certs\") pod \"model-serving-api-86f7b4b499-nfz97\" (UID: \"2fa8c9a2-6512-4791-906f-3f30b6597438\") " pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:10.122180 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:10.122140 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:10.243632 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:10.243608 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nfz97"]
Apr 22 20:04:10.245888 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:04:10.245861 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fa8c9a2_6512_4791_906f_3f30b6597438.slice/crio-047198ec02e02043903af70a6609166894c910761d698c605105ecb933afcbc7 WatchSource:0}: Error finding container 047198ec02e02043903af70a6609166894c910761d698c605105ecb933afcbc7: Status 404 returned error can't find the container with id 047198ec02e02043903af70a6609166894c910761d698c605105ecb933afcbc7
Apr 22 20:04:10.452502 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:10.452462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nfz97" event={"ID":"2fa8c9a2-6512-4791-906f-3f30b6597438","Type":"ContainerStarted","Data":"047198ec02e02043903af70a6609166894c910761d698c605105ecb933afcbc7"}
Apr 22 20:04:12.461043 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:12.461007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nfz97" event={"ID":"2fa8c9a2-6512-4791-906f-3f30b6597438","Type":"ContainerStarted","Data":"cefaec800e1f4d2398529f2192a197e07963119c4925636afe0e2354b8990f1c"}
Apr 22 20:04:12.461477 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:12.461140 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:12.476617 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:12.476565 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-nfz97" podStartSLOduration=1.731903316 podStartE2EDuration="3.476550322s" podCreationTimestamp="2026-04-22 20:04:09 +0000 UTC" firstStartedPulling="2026-04-22 20:04:10.248102495 +0000 UTC m=+384.778824483" lastFinishedPulling="2026-04-22 20:04:11.992749503 +0000 UTC m=+386.523471489" observedRunningTime="2026-04-22 20:04:12.475673156 +0000 UTC m=+387.006395178" watchObservedRunningTime="2026-04-22 20:04:12.476550322 +0000 UTC m=+387.007272331"
Apr 22 20:04:23.469169 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:23.469135 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-nfz97"
Apr 22 20:04:35.408406 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.408374 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-788648b64b-vjdwd"]
Apr 22 20:04:35.411216 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.411194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.422160 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.422136 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-788648b64b-vjdwd"]
Apr 22 20:04:35.448713 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.448670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk2hr\" (UniqueName: \"kubernetes.io/projected/9ff180c2-aeff-48f6-823b-62c92051e703-kube-api-access-xk2hr\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.448845 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.448768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-trusted-ca-bundle\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.448845 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.448792 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-service-ca\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.448934 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.448853 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ff180c2-aeff-48f6-823b-62c92051e703-console-serving-cert\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.448934 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.448890 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-console-config\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.449010 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.448934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ff180c2-aeff-48f6-823b-62c92051e703-console-oauth-config\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.449010 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.448962 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-oauth-serving-cert\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.549520 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.549494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ff180c2-aeff-48f6-823b-62c92051e703-console-oauth-config\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.549643 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.549531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-oauth-serving-cert\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.549643 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.549579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk2hr\" (UniqueName: \"kubernetes.io/projected/9ff180c2-aeff-48f6-823b-62c92051e703-kube-api-access-xk2hr\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.549643 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.549623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-trusted-ca-bundle\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.549835 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.549651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-service-ca\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.549835 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.549677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ff180c2-aeff-48f6-823b-62c92051e703-console-serving-cert\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.549835 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.549701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-console-config\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.550511 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.550485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-oauth-serving-cert\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.550620 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.550587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-service-ca\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.550620 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.550591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-trusted-ca-bundle\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.550715 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.550660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ff180c2-aeff-48f6-823b-62c92051e703-console-config\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.552096 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.552078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ff180c2-aeff-48f6-823b-62c92051e703-console-oauth-config\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.552248 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.552230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ff180c2-aeff-48f6-823b-62c92051e703-console-serving-cert\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.558033 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.558015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk2hr\" (UniqueName: \"kubernetes.io/projected/9ff180c2-aeff-48f6-823b-62c92051e703-kube-api-access-xk2hr\") pod \"console-788648b64b-vjdwd\" (UID: \"9ff180c2-aeff-48f6-823b-62c92051e703\") " pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:35.722407 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:35.722344 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:36.053538 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:36.053516 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-788648b64b-vjdwd"]
Apr 22 20:04:36.055181 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:04:36.055154 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff180c2_aeff_48f6_823b_62c92051e703.slice/crio-a9e906bece5215a09e00fddf59f03cb96834d963f04bee77452682276d11dc6c WatchSource:0}: Error finding container a9e906bece5215a09e00fddf59f03cb96834d963f04bee77452682276d11dc6c: Status 404 returned error can't find the container with id a9e906bece5215a09e00fddf59f03cb96834d963f04bee77452682276d11dc6c
Apr 22 20:04:36.547525 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:36.547490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-788648b64b-vjdwd" event={"ID":"9ff180c2-aeff-48f6-823b-62c92051e703","Type":"ContainerStarted","Data":"2d5a3c7e3986206a780316363f5174423990913ab584cb2e4231bebe2b397791"}
Apr 22 20:04:36.547525 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:36.547526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-788648b64b-vjdwd" event={"ID":"9ff180c2-aeff-48f6-823b-62c92051e703","Type":"ContainerStarted","Data":"a9e906bece5215a09e00fddf59f03cb96834d963f04bee77452682276d11dc6c"}
Apr 22 20:04:36.567626 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:36.567583 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-788648b64b-vjdwd" podStartSLOduration=1.5675699889999999 podStartE2EDuration="1.567569989s" podCreationTimestamp="2026-04-22 20:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:04:36.565870444 +0000 UTC m=+411.096592452" watchObservedRunningTime="2026-04-22 20:04:36.567569989 +0000 UTC m=+411.098291999"
Apr 22 20:04:45.723178 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:45.723131 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:45.723178 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:45.723178 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:45.727713 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:45.727686 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:46.583103 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:46.583073 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-788648b64b-vjdwd"
Apr 22 20:04:46.633015 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:04:46.632978 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fc7856f6d-84ncp"]
Apr 22 20:05:11.654010 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:11.653894 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-fc7856f6d-84ncp" podUID="57575df9-d3dc-4c92-9e55-31a28b27a20c" containerName="console" containerID="cri-o://cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2" gracePeriod=15
Apr 22 20:05:11.897044 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:11.897018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fc7856f6d-84ncp_57575df9-d3dc-4c92-9e55-31a28b27a20c/console/0.log"
Apr 22 20:05:11.897192 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:11.897087 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fc7856f6d-84ncp"
Apr 22 20:05:12.037079 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.036993 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-oauth-config\") pod \"57575df9-d3dc-4c92-9e55-31a28b27a20c\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") "
Apr 22 20:05:12.037079 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037053 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnk9r\" (UniqueName: \"kubernetes.io/projected/57575df9-d3dc-4c92-9e55-31a28b27a20c-kube-api-access-pnk9r\") pod \"57575df9-d3dc-4c92-9e55-31a28b27a20c\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") "
Apr 22 20:05:12.037294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037087 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-oauth-serving-cert\") pod \"57575df9-d3dc-4c92-9e55-31a28b27a20c\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") "
Apr 22 20:05:12.037294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037112 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-trusted-ca-bundle\") pod \"57575df9-d3dc-4c92-9e55-31a28b27a20c\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") "
Apr 22 20:05:12.037294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037131 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-config\") pod \"57575df9-d3dc-4c92-9e55-31a28b27a20c\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") "
Apr 22 20:05:12.037294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037150 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-serving-cert\") pod \"57575df9-d3dc-4c92-9e55-31a28b27a20c\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") "
Apr 22 20:05:12.037294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037164 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-service-ca\") pod \"57575df9-d3dc-4c92-9e55-31a28b27a20c\" (UID: \"57575df9-d3dc-4c92-9e55-31a28b27a20c\") "
Apr 22 20:05:12.037674 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037631 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "57575df9-d3dc-4c92-9e55-31a28b27a20c" (UID: "57575df9-d3dc-4c92-9e55-31a28b27a20c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:05:12.037764 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037676 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-config" (OuterVolumeSpecName: "console-config") pod "57575df9-d3dc-4c92-9e55-31a28b27a20c" (UID: "57575df9-d3dc-4c92-9e55-31a28b27a20c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:05:12.037764 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037705 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-service-ca" (OuterVolumeSpecName: "service-ca") pod "57575df9-d3dc-4c92-9e55-31a28b27a20c" (UID: "57575df9-d3dc-4c92-9e55-31a28b27a20c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:05:12.037764 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.037725 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "57575df9-d3dc-4c92-9e55-31a28b27a20c" (UID: "57575df9-d3dc-4c92-9e55-31a28b27a20c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:05:12.039530 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.039495 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57575df9-d3dc-4c92-9e55-31a28b27a20c-kube-api-access-pnk9r" (OuterVolumeSpecName: "kube-api-access-pnk9r") pod "57575df9-d3dc-4c92-9e55-31a28b27a20c" (UID: "57575df9-d3dc-4c92-9e55-31a28b27a20c"). InnerVolumeSpecName "kube-api-access-pnk9r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:05:12.039530 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.039501 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "57575df9-d3dc-4c92-9e55-31a28b27a20c" (UID: "57575df9-d3dc-4c92-9e55-31a28b27a20c"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:05:12.039704 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.039683 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "57575df9-d3dc-4c92-9e55-31a28b27a20c" (UID: "57575df9-d3dc-4c92-9e55-31a28b27a20c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:05:12.138205 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.138167 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-oauth-config\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:05:12.138205 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.138197 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnk9r\" (UniqueName: \"kubernetes.io/projected/57575df9-d3dc-4c92-9e55-31a28b27a20c-kube-api-access-pnk9r\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:05:12.138205 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.138207 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-oauth-serving-cert\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:05:12.138205 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.138215 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-trusted-ca-bundle\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:05:12.138554 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.138225 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-config\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:05:12.138554 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.138234 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57575df9-d3dc-4c92-9e55-31a28b27a20c-console-serving-cert\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:05:12.138554 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.138242 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57575df9-d3dc-4c92-9e55-31a28b27a20c-service-ca\") on node \"ip-10-0-135-72.ec2.internal\" DevicePath \"\"" Apr 22 20:05:12.663948 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.663920 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fc7856f6d-84ncp_57575df9-d3dc-4c92-9e55-31a28b27a20c/console/0.log" Apr 22 20:05:12.664398 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.663958 2578 generic.go:358] "Generic (PLEG): container finished" podID="57575df9-d3dc-4c92-9e55-31a28b27a20c" containerID="cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2" exitCode=2 Apr 22 20:05:12.664398 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.663990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7856f6d-84ncp" event={"ID":"57575df9-d3dc-4c92-9e55-31a28b27a20c","Type":"ContainerDied","Data":"cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2"} Apr 22 20:05:12.664398 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.664033 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fc7856f6d-84ncp" Apr 22 20:05:12.664398 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.664047 2578 scope.go:117] "RemoveContainer" containerID="cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2" Apr 22 20:05:12.664398 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.664034 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fc7856f6d-84ncp" event={"ID":"57575df9-d3dc-4c92-9e55-31a28b27a20c","Type":"ContainerDied","Data":"401d0529f26396e5b33bfee1387d8a5b82d721ed9e8106c9ca786f6959b482ab"} Apr 22 20:05:12.677947 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.677911 2578 scope.go:117] "RemoveContainer" containerID="cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2" Apr 22 20:05:12.678456 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:05:12.678423 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2\": container with ID starting with cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2 not found: ID does not exist" containerID="cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2" Apr 22 20:05:12.678521 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.678479 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2"} err="failed to get container status \"cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2\": rpc error: code = NotFound desc = could not find container \"cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2\": container with ID starting with cebd97f61229d66683476367cacb6861b002786b6516108b35cd110c354d46c2 not found: ID does not exist" Apr 22 20:05:12.684499 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.684465 2578 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-fc7856f6d-84ncp"] Apr 22 20:05:12.687301 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:12.687275 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-fc7856f6d-84ncp"] Apr 22 20:05:14.047078 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:05:14.047040 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57575df9-d3dc-4c92-9e55-31a28b27a20c" path="/var/lib/kubelet/pods/57575df9-d3dc-4c92-9e55-31a28b27a20c/volumes" Apr 22 20:07:45.952151 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:07:45.952114 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log" Apr 22 20:07:45.953424 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:07:45.953404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log" Apr 22 20:09:32.051439 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.051401 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5"] Apr 22 20:09:32.052041 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.051843 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57575df9-d3dc-4c92-9e55-31a28b27a20c" containerName="console" Apr 22 20:09:32.052041 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.051861 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="57575df9-d3dc-4c92-9e55-31a28b27a20c" containerName="console" Apr 22 20:09:32.052041 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.051956 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="57575df9-d3dc-4c92-9e55-31a28b27a20c" containerName="console" Apr 22 20:09:32.053879 ip-10-0-135-72 kubenswrapper[2578]: I0422 
20:09:32.053861 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" Apr 22 20:09:32.056107 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.056080 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hjbmb\"" Apr 22 20:09:32.062231 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.062204 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5"] Apr 22 20:09:32.064198 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.064183 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" Apr 22 20:09:32.193791 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.193768 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5"] Apr 22 20:09:32.196383 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:09:32.196347 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5eee0d5_20fc_4ef7_9383_b6b499dc986b.slice/crio-f8e70fde1491007817a5c59d9f4d247fd675852c47454503847a3335338e8df2 WatchSource:0}: Error finding container f8e70fde1491007817a5c59d9f4d247fd675852c47454503847a3335338e8df2: Status 404 returned error can't find the container with id f8e70fde1491007817a5c59d9f4d247fd675852c47454503847a3335338e8df2 Apr 22 20:09:32.198070 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.198055 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:09:32.510162 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:32.510116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" 
event={"ID":"e5eee0d5-20fc-4ef7-9383-b6b499dc986b","Type":"ContainerStarted","Data":"f8e70fde1491007817a5c59d9f4d247fd675852c47454503847a3335338e8df2"} Apr 22 20:09:33.514204 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:33.514120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" event={"ID":"e5eee0d5-20fc-4ef7-9383-b6b499dc986b","Type":"ContainerStarted","Data":"50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a"} Apr 22 20:09:33.514578 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:33.514321 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" Apr 22 20:09:33.516085 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:33.516065 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" Apr 22 20:09:33.528072 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:09:33.528034 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" podStartSLOduration=0.560568862 podStartE2EDuration="1.528022139s" podCreationTimestamp="2026-04-22 20:09:32 +0000 UTC" firstStartedPulling="2026-04-22 20:09:32.198181562 +0000 UTC m=+706.728903549" lastFinishedPulling="2026-04-22 20:09:33.165634835 +0000 UTC m=+707.696356826" observedRunningTime="2026-04-22 20:09:33.526866883 +0000 UTC m=+708.057588893" watchObservedRunningTime="2026-04-22 20:09:33.528022139 +0000 UTC m=+708.058744148" Apr 22 20:10:57.095062 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.095030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-d0c94-predictor-68c744c676-9sbm5_e5eee0d5-20fc-4ef7-9383-b6b499dc986b/kserve-container/0.log" Apr 22 20:10:57.356566 ip-10-0-135-72 kubenswrapper[2578]: I0422 
20:10:57.356492 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5"] Apr 22 20:10:57.356759 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.356721 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" podUID="e5eee0d5-20fc-4ef7-9383-b6b499dc986b" containerName="kserve-container" containerID="cri-o://50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a" gracePeriod=30 Apr 22 20:10:57.593721 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.593702 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" Apr 22 20:10:57.791476 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.791397 2578 generic.go:358] "Generic (PLEG): container finished" podID="e5eee0d5-20fc-4ef7-9383-b6b499dc986b" containerID="50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a" exitCode=2 Apr 22 20:10:57.791476 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.791452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" event={"ID":"e5eee0d5-20fc-4ef7-9383-b6b499dc986b","Type":"ContainerDied","Data":"50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a"} Apr 22 20:10:57.791476 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.791466 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" Apr 22 20:10:57.791708 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.791485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5" event={"ID":"e5eee0d5-20fc-4ef7-9383-b6b499dc986b","Type":"ContainerDied","Data":"f8e70fde1491007817a5c59d9f4d247fd675852c47454503847a3335338e8df2"} Apr 22 20:10:57.791708 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.791506 2578 scope.go:117] "RemoveContainer" containerID="50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a" Apr 22 20:10:57.800228 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.800206 2578 scope.go:117] "RemoveContainer" containerID="50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a" Apr 22 20:10:57.800484 ip-10-0-135-72 kubenswrapper[2578]: E0422 20:10:57.800463 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a\": container with ID starting with 50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a not found: ID does not exist" containerID="50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a" Apr 22 20:10:57.800550 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.800496 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a"} err="failed to get container status \"50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a\": rpc error: code = NotFound desc = could not find container \"50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a\": container with ID starting with 50c5bd05462367eaf6c31b981bf5067bad79484102c847c67029fc566aac688a not found: ID does not exist" Apr 22 20:10:57.810313 ip-10-0-135-72 
kubenswrapper[2578]: I0422 20:10:57.810262 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5"] Apr 22 20:10:57.811760 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:57.811740 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-d0c94-predictor-68c744c676-9sbm5"] Apr 22 20:10:58.045946 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:10:58.045868 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5eee0d5-20fc-4ef7-9383-b6b499dc986b" path="/var/lib/kubelet/pods/e5eee0d5-20fc-4ef7-9383-b6b499dc986b/volumes" Apr 22 20:12:45.977004 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:12:45.976970 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log" Apr 22 20:12:45.980732 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:12:45.980709 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log" Apr 22 20:17:37.805929 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:37.805897 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xpbrx_88e758f9-14ca-4081-b67d-e9de91d6ddf6/global-pull-secret-syncer/0.log" Apr 22 20:17:37.962886 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:37.962854 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-r64fz_16b4200b-7937-4b41-acdc-2d428d40a524/konnectivity-agent/0.log" Apr 22 20:17:38.067887 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:38.067787 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-72.ec2.internal_761d5be20dc0ed38f4bc469fd088da76/haproxy/0.log" Apr 22 20:17:41.555858 ip-10-0-135-72 
kubenswrapper[2578]: I0422 20:17:41.555829 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-fvkt4_e05b401e-86d3-4f58-ba83-8727ba2b2682/cluster-monitoring-operator/0.log" Apr 22 20:17:41.715561 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:41.715537 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8wksb_8f49321b-5acd-4547-be8f-3070921da9ea/node-exporter/0.log" Apr 22 20:17:41.737559 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:41.737532 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8wksb_8f49321b-5acd-4547-be8f-3070921da9ea/kube-rbac-proxy/0.log" Apr 22 20:17:41.757600 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:41.757575 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8wksb_8f49321b-5acd-4547-be8f-3070921da9ea/init-textfile/0.log" Apr 22 20:17:43.560373 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:43.560341 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-5lsr2_f0a675ce-7b40-4a85-8869-d492c9e0218d/networking-console-plugin/0.log" Apr 22 20:17:43.995668 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:43.995643 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log" Apr 22 20:17:44.000006 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:43.999983 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/3.log" Apr 22 20:17:44.399594 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.399545 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-788648b64b-vjdwd_9ff180c2-aeff-48f6-823b-62c92051e703/console/0.log" Apr 22 20:17:44.429498 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.429446 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-b4vgs_fb5e4789-8a1a-445c-aeaf-e55b1a760fb5/download-server/0.log" Apr 22 20:17:44.827365 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.827289 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zmn8h_93e25588-349f-4ceb-bdeb-d27b0e71e171/volume-data-source-validator/0.log" Apr 22 20:17:44.972467 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.972432 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs"] Apr 22 20:17:44.972797 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.972784 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5eee0d5-20fc-4ef7-9383-b6b499dc986b" containerName="kserve-container" Apr 22 20:17:44.972874 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.972799 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eee0d5-20fc-4ef7-9383-b6b499dc986b" containerName="kserve-container" Apr 22 20:17:44.972910 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.972891 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5eee0d5-20fc-4ef7-9383-b6b499dc986b" containerName="kserve-container" Apr 22 20:17:44.975913 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.975888 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:44.978531 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.978501 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mv695\"/\"openshift-service-ca.crt\"" Apr 22 20:17:44.979048 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.979031 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mv695\"/\"default-dockercfg-kkmqp\"" Apr 22 20:17:44.979120 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.979079 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mv695\"/\"kube-root-ca.crt\"" Apr 22 20:17:44.986294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:44.986268 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs"] Apr 22 20:17:45.049994 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.049946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24j9\" (UniqueName: \"kubernetes.io/projected/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-kube-api-access-q24j9\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.050176 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.050032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-sys\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.050176 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.050053 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-proc\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.050176 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.050080 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-lib-modules\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.050176 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.050097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-podres\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-lib-modules\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151294 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-podres\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " 
pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q24j9\" (UniqueName: \"kubernetes.io/projected/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-kube-api-access-q24j9\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-sys\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151379 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-proc\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-proc\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-podres\") pod 
\"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151489 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-sys\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.151534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.151460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-lib-modules\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.159552 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.159513 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24j9\" (UniqueName: \"kubernetes.io/projected/4b91b67d-9bfc-4cf1-802c-a9eb7ed08706-kube-api-access-q24j9\") pod \"perf-node-gather-daemonset-fjkrs\" (UID: \"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.285686 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.285645 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:45.417683 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.417655 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs"] Apr 22 20:17:45.420642 ip-10-0-135-72 kubenswrapper[2578]: W0422 20:17:45.420609 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4b91b67d_9bfc_4cf1_802c_a9eb7ed08706.slice/crio-12798aa78e90f7a9222613c71478716d389d883f243dab6f205c5d8184169f2b WatchSource:0}: Error finding container 12798aa78e90f7a9222613c71478716d389d883f243dab6f205c5d8184169f2b: Status 404 returned error can't find the container with id 12798aa78e90f7a9222613c71478716d389d883f243dab6f205c5d8184169f2b Apr 22 20:17:45.422516 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.422501 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:17:45.565935 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.565889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rgmkt_3b42faf8-dfc9-477e-a74b-abcef44beb8e/dns/0.log" Apr 22 20:17:45.585277 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.585248 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rgmkt_3b42faf8-dfc9-477e-a74b-abcef44beb8e/kube-rbac-proxy/0.log" Apr 22 20:17:45.650207 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:45.650170 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzz8z_f4ad43cf-a292-44ff-a1ae-9d139860c9cc/dns-node-resolver/0.log" Apr 22 20:17:46.000696 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.000666 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log" Apr 22 20:17:46.004921 
ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.004894 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-lzbxz_10902c6d-77cd-4e80-86e7-28633566a0ee/console-operator/2.log" Apr 22 20:17:46.072603 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.072570 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pjssp_8501acc2-dabe-4f52-9b02-ba92e386acb7/node-ca/0.log" Apr 22 20:17:46.143448 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.143411 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" event={"ID":"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706","Type":"ContainerStarted","Data":"190596fbe07e8a9623597e7f06efd6a2a0750e3b493aeb87c8d81d263efd69aa"} Apr 22 20:17:46.143448 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.143452 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" event={"ID":"4b91b67d-9bfc-4cf1-802c-a9eb7ed08706","Type":"ContainerStarted","Data":"12798aa78e90f7a9222613c71478716d389d883f243dab6f205c5d8184169f2b"} Apr 22 20:17:46.143702 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.143475 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:46.159518 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.159462 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" podStartSLOduration=2.15944444 podStartE2EDuration="2.15944444s" podCreationTimestamp="2026-04-22 20:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:17:46.158298353 +0000 UTC m=+1200.689020373" watchObservedRunningTime="2026-04-22 20:17:46.15944444 +0000 
UTC m=+1200.690166450" Apr 22 20:17:46.760934 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:46.760899 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-748d997cd4-p2hd6_cd91dcdf-472d-4758-b11b-7e7b6d347fbd/router/0.log" Apr 22 20:17:47.071233 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:47.071159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8zwhn_54257b88-b9cb-44e6-9885-78eb59be8c12/serve-healthcheck-canary/0.log" Apr 22 20:17:47.458885 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:47.458822 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-nbb5m_c79df87a-c734-4d84-b239-5cfbd4266788/insights-operator/0.log" Apr 22 20:17:47.459200 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:47.459182 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-nbb5m_c79df87a-c734-4d84-b239-5cfbd4266788/insights-operator/1.log" Apr 22 20:17:47.539554 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:47.539525 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fxfbt_0f7244af-0f0f-4ec7-9a81-fd42125846a4/kube-rbac-proxy/0.log" Apr 22 20:17:47.561988 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:47.561959 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fxfbt_0f7244af-0f0f-4ec7-9a81-fd42125846a4/exporter/0.log" Apr 22 20:17:47.582464 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:47.582431 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fxfbt_0f7244af-0f0f-4ec7-9a81-fd42125846a4/extractor/0.log" Apr 22 20:17:49.570643 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:49.570602 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_kserve-controller-manager-665c47d676-47g8t_220fa352-1edc-488c-acc4-5d90d6a02793/manager/0.log" Apr 22 20:17:49.609961 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:49.609931 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-nfz97_2fa8c9a2-6512-4791-906f-3f30b6597438/server/0.log" Apr 22 20:17:49.727711 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:49.727679 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-xlkfm_f8ee8283-a178-4e05-8ef0-bef0bae22044/seaweedfs/0.log" Apr 22 20:17:52.157534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:52.157505 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-fjkrs" Apr 22 20:17:53.251482 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:53.251440 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9ggs9_e7f9c124-ab6b-4350-ae69-f352a89d7122/migrator/0.log" Apr 22 20:17:53.279743 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:53.279701 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9ggs9_e7f9c124-ab6b-4350-ae69-f352a89d7122/graceful-termination/0.log" Apr 22 20:17:53.630125 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:53.630067 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hdqk8_99eb7f40-e81e-4454-b333-f70327da668c/kube-storage-version-migrator-operator/1.log" Apr 22 20:17:53.631167 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:53.631147 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hdqk8_99eb7f40-e81e-4454-b333-f70327da668c/kube-storage-version-migrator-operator/0.log" Apr 22 20:17:54.967100 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:54.967068 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zh622_b7a6e97e-64c6-44de-9b0f-622a7b3a2316/kube-multus-additional-cni-plugins/0.log" Apr 22 20:17:54.988440 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:54.988401 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zh622_b7a6e97e-64c6-44de-9b0f-622a7b3a2316/egress-router-binary-copy/0.log" Apr 22 20:17:55.007222 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.007196 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zh622_b7a6e97e-64c6-44de-9b0f-622a7b3a2316/cni-plugins/0.log" Apr 22 20:17:55.026817 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.026789 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zh622_b7a6e97e-64c6-44de-9b0f-622a7b3a2316/bond-cni-plugin/0.log" Apr 22 20:17:55.046729 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.046699 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zh622_b7a6e97e-64c6-44de-9b0f-622a7b3a2316/routeoverride-cni/0.log" Apr 22 20:17:55.066036 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.066011 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zh622_b7a6e97e-64c6-44de-9b0f-622a7b3a2316/whereabouts-cni-bincopy/0.log" Apr 22 20:17:55.085558 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.085534 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zh622_b7a6e97e-64c6-44de-9b0f-622a7b3a2316/whereabouts-cni/0.log" Apr 22 20:17:55.165786 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.165706 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-m22tm_9c8b3da8-9112-4c9d-abf4-97a17bb2a3ab/kube-multus/0.log" Apr 22 20:17:55.272488 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.272462 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xbxhx_802bd93c-03cf-435c-a223-487ff037f6c7/network-metrics-daemon/0.log" Apr 22 20:17:55.303058 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:55.303032 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xbxhx_802bd93c-03cf-435c-a223-487ff037f6c7/kube-rbac-proxy/0.log" Apr 22 20:17:56.076534 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.076498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/ovn-controller/0.log" Apr 22 20:17:56.100342 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.100314 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/ovn-acl-logging/0.log" Apr 22 20:17:56.118298 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.118266 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/kube-rbac-proxy-node/0.log" Apr 22 20:17:56.140710 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.140679 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:17:56.161111 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.161084 2578 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/northd/0.log" Apr 22 20:17:56.183723 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.183694 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/nbdb/0.log" Apr 22 20:17:56.208642 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.208558 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/sbdb/0.log" Apr 22 20:17:56.305090 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:56.305061 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ncpz_bebc2174-0145-4f91-b0a3-c497f508c693/ovnkube-controller/0.log" Apr 22 20:17:57.815824 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:57.815779 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-f7ckn_b0d5f54b-81d2-4910-b1b2-87b6f74fa261/check-endpoints/0.log" Apr 22 20:17:57.879800 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:57.879772 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rktp2_0aaf6153-a940-4bb7-9f56-61f82d60b50d/network-check-target-container/0.log" Apr 22 20:17:58.774541 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:58.774512 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rd9fc_64dcc192-4f40-4fa7-bb9c-1dacc5985c26/iptables-alerter/0.log" Apr 22 20:17:59.412772 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:17:59.412742 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5x5fq_4c893932-7c81-4353-821d-dd67be4edf70/tuned/0.log" Apr 22 20:18:01.105773 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:18:01.105738 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-6vcjz_1ac0aac7-46c9-42f6-8aaa-e626360e1faa/cluster-samples-operator/0.log" Apr 22 20:18:01.120541 ip-10-0-135-72 kubenswrapper[2578]: I0422 20:18:01.120517 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-6vcjz_1ac0aac7-46c9-42f6-8aaa-e626360e1faa/cluster-samples-operator-watch/0.log"